Jan 15 23:45:01.763343 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Jan 15 23:45:01.763365 kernel: Linux version 6.12.65-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Thu Jan 15 22:06:59 -00 2026
Jan 15 23:45:01.763375 kernel: KASLR enabled
Jan 15 23:45:01.763381 kernel: efi: EFI v2.7 by EDK II
Jan 15 23:45:01.763386 kernel: efi: SMBIOS 3.0=0x43bed0000 MEMATTR=0x43a714018 ACPI 2.0=0x438430018 RNG=0x43843e818 MEMRESERVE=0x438357218
Jan 15 23:45:01.763391 kernel: random: crng init done
Jan 15 23:45:01.763398 kernel: secureboot: Secure boot disabled
Jan 15 23:45:01.763403 kernel: ACPI: Early table checksum verification disabled
Jan 15 23:45:01.763409 kernel: ACPI: RSDP 0x0000000438430018 000024 (v02 BOCHS )
Jan 15 23:45:01.763415 kernel: ACPI: XSDT 0x000000043843FE98 000074 (v01 BOCHS BXPC 00000001 01000013)
Jan 15 23:45:01.763422 kernel: ACPI: FACP 0x000000043843FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Jan 15 23:45:01.763428 kernel: ACPI: DSDT 0x0000000438437518 0014A2 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Jan 15 23:45:01.763448 kernel: ACPI: APIC 0x000000043843FC18 0001A8 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Jan 15 23:45:01.763458 kernel: ACPI: PPTT 0x000000043843D898 000114 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Jan 15 23:45:01.763465 kernel: ACPI: GTDT 0x000000043843E898 000068 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jan 15 23:45:01.763472 kernel: ACPI: MCFG 0x000000043843FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 15 23:45:01.763481 kernel: ACPI: SPCR 0x000000043843E498 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Jan 15 23:45:01.763487 kernel: ACPI: DBG2 0x000000043843E798 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Jan 15 23:45:01.763493 kernel: ACPI: SRAT 0x000000043843E518 0000A0 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jan 15 23:45:01.763499 kernel: ACPI: IORT 0x000000043843E618 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jan 15 23:45:01.763505 kernel: ACPI: BGRT 0x000000043843E718 000038 (v01 INTEL EDK2 00000002 01000013)
Jan 15 23:45:01.763511 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
Jan 15 23:45:01.763517 kernel: ACPI: Use ACPI SPCR as default console: Yes
Jan 15 23:45:01.763523 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000-0x43fffffff]
Jan 15 23:45:01.763529 kernel: NODE_DATA(0) allocated [mem 0x43dff1a00-0x43dff8fff]
Jan 15 23:45:01.763535 kernel: Zone ranges:
Jan 15 23:45:01.763542 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff]
Jan 15 23:45:01.763548 kernel: DMA32 empty
Jan 15 23:45:01.763554 kernel: Normal [mem 0x0000000100000000-0x000000043fffffff]
Jan 15 23:45:01.763560 kernel: Device empty
Jan 15 23:45:01.763566 kernel: Movable zone start for each node
Jan 15 23:45:01.763572 kernel: Early memory node ranges
Jan 15 23:45:01.763578 kernel: node 0: [mem 0x0000000040000000-0x000000043843ffff]
Jan 15 23:45:01.763584 kernel: node 0: [mem 0x0000000438440000-0x000000043872ffff]
Jan 15 23:45:01.763590 kernel: node 0: [mem 0x0000000438730000-0x000000043bbfffff]
Jan 15 23:45:01.763596 kernel: node 0: [mem 0x000000043bc00000-0x000000043bfdffff]
Jan 15 23:45:01.763602 kernel: node 0: [mem 0x000000043bfe0000-0x000000043fffffff]
Jan 15 23:45:01.763609 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x000000043fffffff]
Jan 15 23:45:01.763616 kernel: cma: Reserved 16 MiB at 0x00000000ff000000 on node -1
Jan 15 23:45:01.763622 kernel: psci: probing for conduit method from ACPI.
Jan 15 23:45:01.763631 kernel: psci: PSCIv1.3 detected in firmware.
Jan 15 23:45:01.763638 kernel: psci: Using standard PSCI v0.2 function IDs
Jan 15 23:45:01.763644 kernel: psci: Trusted OS migration not required
Jan 15 23:45:01.763651 kernel: psci: SMC Calling Convention v1.1
Jan 15 23:45:01.763658 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Jan 15 23:45:01.763664 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0
Jan 15 23:45:01.763671 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0
Jan 15 23:45:01.763677 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x2 -> Node 0
Jan 15 23:45:01.763683 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x3 -> Node 0
Jan 15 23:45:01.763690 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Jan 15 23:45:01.763696 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Jan 15 23:45:01.763703 kernel: pcpu-alloc: [0] 0 [0] 1 [0] 2 [0] 3
Jan 15 23:45:01.763709 kernel: Detected PIPT I-cache on CPU0
Jan 15 23:45:01.763715 kernel: CPU features: detected: GIC system register CPU interface
Jan 15 23:45:01.763722 kernel: CPU features: detected: Spectre-v4
Jan 15 23:45:01.763729 kernel: CPU features: detected: Spectre-BHB
Jan 15 23:45:01.763736 kernel: CPU features: kernel page table isolation forced ON by KASLR
Jan 15 23:45:01.763742 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Jan 15 23:45:01.763749 kernel: CPU features: detected: ARM erratum 1418040
Jan 15 23:45:01.763755 kernel: CPU features: detected: SSBS not fully self-synchronizing
Jan 15 23:45:01.763761 kernel: alternatives: applying boot alternatives
Jan 15 23:45:01.763769 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=83f7d443283b2e87b6283ab8b3252eb2d2356b218981a63efeb3e370fba6f971
Jan 15 23:45:01.763776 kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Jan 15 23:45:01.763782 kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Jan 15 23:45:01.763788 kernel: Fallback order for Node 0: 0
Jan 15 23:45:01.763796 kernel: Built 1 zonelists, mobility grouping on. Total pages: 4194304
Jan 15 23:45:01.763802 kernel: Policy zone: Normal
Jan 15 23:45:01.763809 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 15 23:45:01.763815 kernel: software IO TLB: area num 4.
Jan 15 23:45:01.763822 kernel: software IO TLB: mapped [mem 0x00000000fb000000-0x00000000ff000000] (64MB)
Jan 15 23:45:01.763828 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Jan 15 23:45:01.763834 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 15 23:45:01.763841 kernel: rcu: RCU event tracing is enabled.
Jan 15 23:45:01.763848 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
Jan 15 23:45:01.763854 kernel: Trampoline variant of Tasks RCU enabled.
Jan 15 23:45:01.763861 kernel: Tracing variant of Tasks RCU enabled.
Jan 15 23:45:01.763868 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 15 23:45:01.763875 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Jan 15 23:45:01.763882 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Jan 15 23:45:01.763888 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Jan 15 23:45:01.763895 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Jan 15 23:45:01.763901 kernel: GICv3: 256 SPIs implemented
Jan 15 23:45:01.763907 kernel: GICv3: 0 Extended SPIs implemented
Jan 15 23:45:01.763913 kernel: Root IRQ handler: gic_handle_irq
Jan 15 23:45:01.763920 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Jan 15 23:45:01.763926 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0
Jan 15 23:45:01.763933 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Jan 15 23:45:01.763939 kernel: ITS [mem 0x08080000-0x0809ffff]
Jan 15 23:45:01.763946 kernel: ITS@0x0000000008080000: allocated 8192 Devices @100110000 (indirect, esz 8, psz 64K, shr 1)
Jan 15 23:45:01.763954 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @100120000 (flat, esz 8, psz 64K, shr 1)
Jan 15 23:45:01.763960 kernel: GICv3: using LPI property table @0x0000000100130000
Jan 15 23:45:01.763967 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000100140000
Jan 15 23:45:01.763973 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 15 23:45:01.763979 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jan 15 23:45:01.763986 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Jan 15 23:45:01.763992 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Jan 15 23:45:01.763999 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Jan 15 23:45:01.764005 kernel: arm-pv: using stolen time PV
Jan 15 23:45:01.764012 kernel: Console: colour dummy device 80x25
Jan 15 23:45:01.764020 kernel: ACPI: Core revision 20240827
Jan 15 23:45:01.764027 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Jan 15 23:45:01.764034 kernel: pid_max: default: 32768 minimum: 301
Jan 15 23:45:01.764040 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Jan 15 23:45:01.764047 kernel: landlock: Up and running.
Jan 15 23:45:01.764053 kernel: SELinux: Initializing.
Jan 15 23:45:01.764060 kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Jan 15 23:45:01.764066 kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Jan 15 23:45:01.764073 kernel: rcu: Hierarchical SRCU implementation.
Jan 15 23:45:01.764079 kernel: rcu: Max phase no-delay instances is 400.
Jan 15 23:45:01.764087 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Jan 15 23:45:01.764094 kernel: Remapping and enabling EFI services.
Jan 15 23:45:01.764101 kernel: smp: Bringing up secondary CPUs ...
Jan 15 23:45:01.764107 kernel: Detected PIPT I-cache on CPU1
Jan 15 23:45:01.764114 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Jan 15 23:45:01.764121 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100150000
Jan 15 23:45:01.764127 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jan 15 23:45:01.764134 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Jan 15 23:45:01.764140 kernel: Detected PIPT I-cache on CPU2
Jan 15 23:45:01.764153 kernel: GICv3: CPU2: found redistributor 2 region 0:0x00000000080e0000
Jan 15 23:45:01.764160 kernel: GICv3: CPU2: using allocated LPI pending table @0x0000000100160000
Jan 15 23:45:01.764167 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jan 15 23:45:01.764175 kernel: CPU2: Booted secondary processor 0x0000000002 [0x413fd0c1]
Jan 15 23:45:01.764182 kernel: Detected PIPT I-cache on CPU3
Jan 15 23:45:01.764189 kernel: GICv3: CPU3: found redistributor 3 region 0:0x0000000008100000
Jan 15 23:45:01.764196 kernel: GICv3: CPU3: using allocated LPI pending table @0x0000000100170000
Jan 15 23:45:01.764203 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jan 15 23:45:01.764213 kernel: CPU3: Booted secondary processor 0x0000000003 [0x413fd0c1]
Jan 15 23:45:01.764220 kernel: smp: Brought up 1 node, 4 CPUs
Jan 15 23:45:01.764227 kernel: SMP: Total of 4 processors activated.
Jan 15 23:45:01.764234 kernel: CPU: All CPU(s) started at EL1
Jan 15 23:45:01.764240 kernel: CPU features: detected: 32-bit EL0 Support
Jan 15 23:45:01.764247 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Jan 15 23:45:01.764254 kernel: CPU features: detected: Common not Private translations
Jan 15 23:45:01.764261 kernel: CPU features: detected: CRC32 instructions
Jan 15 23:45:01.764268 kernel: CPU features: detected: Enhanced Virtualization Traps
Jan 15 23:45:01.764276 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Jan 15 23:45:01.764283 kernel: CPU features: detected: LSE atomic instructions
Jan 15 23:45:01.764290 kernel: CPU features: detected: Privileged Access Never
Jan 15 23:45:01.764297 kernel: CPU features: detected: RAS Extension Support
Jan 15 23:45:01.764304 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Jan 15 23:45:01.764311 kernel: alternatives: applying system-wide alternatives
Jan 15 23:45:01.764318 kernel: CPU features: detected: Hardware dirty bit management on CPU0-3
Jan 15 23:45:01.764325 kernel: Memory: 16297360K/16777216K available (11200K kernel code, 2458K rwdata, 9088K rodata, 39552K init, 1038K bss, 457072K reserved, 16384K cma-reserved)
Jan 15 23:45:01.764332 kernel: devtmpfs: initialized
Jan 15 23:45:01.764340 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 15 23:45:01.764347 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
Jan 15 23:45:01.764354 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Jan 15 23:45:01.764361 kernel: 0 pages in range for non-PLT usage
Jan 15 23:45:01.764368 kernel: 508400 pages in range for PLT usage
Jan 15 23:45:01.764375 kernel: pinctrl core: initialized pinctrl subsystem
Jan 15 23:45:01.764382 kernel: SMBIOS 3.0.0 present.
Jan 15 23:45:01.764389 kernel: DMI: QEMU KVM Virtual Machine, BIOS 0.0.0 02/06/2015
Jan 15 23:45:01.764395 kernel: DMI: Memory slots populated: 1/1
Jan 15 23:45:01.764404 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 15 23:45:01.764411 kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations
Jan 15 23:45:01.764418 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jan 15 23:45:01.764425 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jan 15 23:45:01.764432 kernel: audit: initializing netlink subsys (disabled)
Jan 15 23:45:01.764445 kernel: audit: type=2000 audit(0.041:1): state=initialized audit_enabled=0 res=1
Jan 15 23:45:01.764452 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 15 23:45:01.764459 kernel: cpuidle: using governor menu
Jan 15 23:45:01.764466 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Jan 15 23:45:01.764475 kernel: ASID allocator initialised with 32768 entries
Jan 15 23:45:01.764482 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 15 23:45:01.764489 kernel: Serial: AMBA PL011 UART driver
Jan 15 23:45:01.764496 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 15 23:45:01.764503 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Jan 15 23:45:01.764510 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Jan 15 23:45:01.764516 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Jan 15 23:45:01.764523 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 15 23:45:01.764530 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Jan 15 23:45:01.764538 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Jan 15 23:45:01.764545 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Jan 15 23:45:01.764552 kernel: ACPI: Added _OSI(Module Device)
Jan 15 23:45:01.764559 kernel: ACPI: Added _OSI(Processor Device)
Jan 15 23:45:01.764566 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 15 23:45:01.764573 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 15 23:45:01.764580 kernel: ACPI: Interpreter enabled
Jan 15 23:45:01.764587 kernel: ACPI: Using GIC for interrupt routing
Jan 15 23:45:01.764593 kernel: ACPI: MCFG table detected, 1 entries
Jan 15 23:45:01.764602 kernel: ACPI: CPU0 has been hot-added
Jan 15 23:45:01.764608 kernel: ACPI: CPU1 has been hot-added
Jan 15 23:45:01.764615 kernel: ACPI: CPU2 has been hot-added
Jan 15 23:45:01.764622 kernel: ACPI: CPU3 has been hot-added
Jan 15 23:45:01.764629 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Jan 15 23:45:01.764636 kernel: printk: legacy console [ttyAMA0] enabled
Jan 15 23:45:01.764643 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 15 23:45:01.764774 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Jan 15 23:45:01.764841 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Jan 15 23:45:01.764898 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Jan 15 23:45:01.764955 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Jan 15 23:45:01.765010 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Jan 15 23:45:01.765019 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Jan 15 23:45:01.765026 kernel: PCI host bridge to bus 0000:00
Jan 15 23:45:01.765090 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Jan 15 23:45:01.765144 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Jan 15 23:45:01.765197 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Jan 15 23:45:01.765248 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 15 23:45:01.765323 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint
Jan 15 23:45:01.765392 kernel: pci 0000:00:01.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 15 23:45:01.765474 kernel: pci 0000:00:01.0: BAR 0 [mem 0x125a0000-0x125a0fff]
Jan 15 23:45:01.765536 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Jan 15 23:45:01.765597 kernel: pci 0000:00:01.0: bridge window [mem 0x12400000-0x124fffff]
Jan 15 23:45:01.765656 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80000fffff 64bit pref]
Jan 15 23:45:01.765724 kernel: pci 0000:00:01.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 15 23:45:01.765782 kernel: pci 0000:00:01.1: BAR 0 [mem 0x1259f000-0x1259ffff]
Jan 15 23:45:01.765839 kernel: pci 0000:00:01.1: PCI bridge to [bus 02]
Jan 15 23:45:01.765897 kernel: pci 0000:00:01.1: bridge window [mem 0x12300000-0x123fffff]
Jan 15 23:45:01.765967 kernel: pci 0000:00:01.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 15 23:45:01.766044 kernel: pci 0000:00:01.2: BAR 0 [mem 0x1259e000-0x1259efff]
Jan 15 23:45:01.766104 kernel: pci 0000:00:01.2: PCI bridge to [bus 03]
Jan 15 23:45:01.766161 kernel: pci 0000:00:01.2: bridge window [mem 0x12200000-0x122fffff]
Jan 15 23:45:01.766218 kernel: pci 0000:00:01.2: bridge window [mem 0x8000100000-0x80001fffff 64bit pref]
Jan 15 23:45:01.766283 kernel: pci 0000:00:01.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 15 23:45:01.766343 kernel: pci 0000:00:01.3: BAR 0 [mem 0x1259d000-0x1259dfff]
Jan 15 23:45:01.766401 kernel: pci 0000:00:01.3: PCI bridge to [bus 04]
Jan 15 23:45:01.766481 kernel: pci 0000:00:01.3: bridge window [mem 0x8000200000-0x80002fffff 64bit pref]
Jan 15 23:45:01.766547 kernel: pci 0000:00:01.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 15 23:45:01.766606 kernel: pci 0000:00:01.4: BAR 0 [mem 0x1259c000-0x1259cfff]
Jan 15 23:45:01.766663 kernel: pci 0000:00:01.4: PCI bridge to [bus 05]
Jan 15 23:45:01.766720 kernel: pci 0000:00:01.4: bridge window [mem 0x12100000-0x121fffff]
Jan 15 23:45:01.766777 kernel: pci 0000:00:01.4: bridge window [mem 0x8000300000-0x80003fffff 64bit pref]
Jan 15 23:45:01.766844 kernel: pci 0000:00:01.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 15 23:45:01.766906 kernel: pci 0000:00:01.5: BAR 0 [mem 0x1259b000-0x1259bfff]
Jan 15 23:45:01.766963 kernel: pci 0000:00:01.5: PCI bridge to [bus 06]
Jan 15 23:45:01.767021 kernel: pci 0000:00:01.5: bridge window [mem 0x12000000-0x120fffff]
Jan 15 23:45:01.767078 kernel: pci 0000:00:01.5: bridge window [mem 0x8000400000-0x80004fffff 64bit pref]
Jan 15 23:45:01.767141 kernel: pci 0000:00:01.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 15 23:45:01.767200 kernel: pci 0000:00:01.6: BAR 0 [mem 0x1259a000-0x1259afff]
Jan 15 23:45:01.767257 kernel: pci 0000:00:01.6: PCI bridge to [bus 07]
Jan 15 23:45:01.767325 kernel: pci 0000:00:01.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 15 23:45:01.767383 kernel: pci 0000:00:01.7: BAR 0 [mem 0x12599000-0x12599fff]
Jan 15 23:45:01.767456 kernel: pci 0000:00:01.7: PCI bridge to [bus 08]
Jan 15 23:45:01.767526 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 15 23:45:01.767587 kernel: pci 0000:00:02.0: BAR 0 [mem 0x12598000-0x12598fff]
Jan 15 23:45:01.767644 kernel: pci 0000:00:02.0: PCI bridge to [bus 09]
Jan 15 23:45:01.767707 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 15 23:45:01.767767 kernel: pci 0000:00:02.1: BAR 0 [mem 0x12597000-0x12597fff]
Jan 15 23:45:01.767824 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a]
Jan 15 23:45:01.767893 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 15 23:45:01.767952 kernel: pci 0000:00:02.2: BAR 0 [mem 0x12596000-0x12596fff]
Jan 15 23:45:01.768009 kernel: pci 0000:00:02.2: PCI bridge to [bus 0b]
Jan 15 23:45:01.768073 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 15 23:45:01.768132 kernel: pci 0000:00:02.3: BAR 0 [mem 0x12595000-0x12595fff]
Jan 15 23:45:01.768189 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c]
Jan 15 23:45:01.768254 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 15 23:45:01.768313 kernel: pci 0000:00:02.4: BAR 0 [mem 0x12594000-0x12594fff]
Jan 15 23:45:01.768370 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d]
Jan 15 23:45:01.768441 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 15 23:45:01.768504 kernel: pci 0000:00:02.5: BAR 0 [mem 0x12593000-0x12593fff]
Jan 15 23:45:01.768562 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e]
Jan 15 23:45:01.768627 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 15 23:45:01.768685 kernel: pci 0000:00:02.6: BAR 0 [mem 0x12592000-0x12592fff]
Jan 15 23:45:01.768742 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f]
Jan 15 23:45:01.768806 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 15 23:45:01.768863 kernel: pci 0000:00:02.7: BAR 0 [mem 0x12591000-0x12591fff]
Jan 15 23:45:01.768923 kernel: pci 0000:00:02.7: PCI bridge to [bus 10]
Jan 15 23:45:01.768986 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 15 23:45:01.769044 kernel: pci 0000:00:03.0: BAR 0 [mem 0x12590000-0x12590fff]
Jan 15 23:45:01.769101 kernel: pci 0000:00:03.0: PCI bridge to [bus 11]
Jan 15 23:45:01.769164 kernel: pci 0000:00:03.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 15 23:45:01.769221 kernel: pci 0000:00:03.1: BAR 0 [mem 0x1258f000-0x1258ffff]
Jan 15 23:45:01.769279 kernel: pci 0000:00:03.1: PCI bridge to [bus 12]
Jan 15 23:45:01.769338 kernel: pci 0000:00:03.1: bridge window [io 0xf000-0xffff]
Jan 15 23:45:01.769395 kernel: pci 0000:00:03.1: bridge window [mem 0x11e00000-0x11ffffff]
Jan 15 23:45:01.769471 kernel: pci 0000:00:03.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 15 23:45:01.769532 kernel: pci 0000:00:03.2: BAR 0 [mem 0x1258e000-0x1258efff]
Jan 15 23:45:01.769591 kernel: pci 0000:00:03.2: PCI bridge to [bus 13]
Jan 15 23:45:01.769654 kernel: pci 0000:00:03.2: bridge window [io 0xe000-0xefff]
Jan 15 23:45:01.769716 kernel: pci 0000:00:03.2: bridge window [mem 0x11c00000-0x11dfffff]
Jan 15 23:45:01.769797 kernel: pci 0000:00:03.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 15 23:45:01.769860 kernel: pci 0000:00:03.3: BAR 0 [mem 0x1258d000-0x1258dfff]
Jan 15 23:45:01.769923 kernel: pci 0000:00:03.3: PCI bridge to [bus 14]
Jan 15 23:45:01.769980 kernel: pci 0000:00:03.3: bridge window [io 0xd000-0xdfff]
Jan 15 23:45:01.770050 kernel: pci 0000:00:03.3: bridge window [mem 0x11a00000-0x11bfffff]
Jan 15 23:45:01.770149 kernel: pci 0000:00:03.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 15 23:45:01.770210 kernel: pci 0000:00:03.4: BAR 0 [mem 0x1258c000-0x1258cfff]
Jan 15 23:45:01.770272 kernel: pci 0000:00:03.4: PCI bridge to [bus 15]
Jan 15 23:45:01.770329 kernel: pci 0000:00:03.4: bridge window [io 0xc000-0xcfff]
Jan 15 23:45:01.770386 kernel: pci 0000:00:03.4: bridge window [mem 0x11800000-0x119fffff]
Jan 15 23:45:01.770460 kernel: pci 0000:00:03.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 15 23:45:01.770520 kernel: pci 0000:00:03.5: BAR 0 [mem 0x1258b000-0x1258bfff]
Jan 15 23:45:01.770578 kernel: pci 0000:00:03.5: PCI bridge to [bus 16]
Jan 15 23:45:01.770635 kernel: pci 0000:00:03.5: bridge window [io 0xb000-0xbfff]
Jan 15 23:45:01.770692 kernel: pci 0000:00:03.5: bridge window [mem 0x11600000-0x117fffff]
Jan 15 23:45:01.770758 kernel: pci 0000:00:03.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 15 23:45:01.770816 kernel: pci 0000:00:03.6: BAR 0 [mem 0x1258a000-0x1258afff]
Jan 15 23:45:01.770876 kernel: pci 0000:00:03.6: PCI bridge to [bus 17]
Jan 15 23:45:01.770934 kernel: pci 0000:00:03.6: bridge window [io 0xa000-0xafff]
Jan 15 23:45:01.770990 kernel: pci 0000:00:03.6: bridge window [mem 0x11400000-0x115fffff]
Jan 15 23:45:01.771053 kernel: pci 0000:00:03.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 15 23:45:01.771110 kernel: pci 0000:00:03.7: BAR 0 [mem 0x12589000-0x12589fff]
Jan 15 23:45:01.771171 kernel: pci 0000:00:03.7: PCI bridge to [bus 18]
Jan 15 23:45:01.771228 kernel: pci 0000:00:03.7: bridge window [io 0x9000-0x9fff]
Jan 15 23:45:01.771286 kernel: pci 0000:00:03.7: bridge window [mem 0x11200000-0x113fffff]
Jan 15 23:45:01.771349 kernel: pci 0000:00:04.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 15 23:45:01.771407 kernel: pci 0000:00:04.0: BAR 0 [mem 0x12588000-0x12588fff]
Jan 15 23:45:01.771472 kernel: pci 0000:00:04.0: PCI bridge to [bus 19]
Jan 15 23:45:01.771531 kernel: pci 0000:00:04.0: bridge window [io 0x8000-0x8fff]
Jan 15 23:45:01.771592 kernel: pci 0000:00:04.0: bridge window [mem 0x11000000-0x111fffff]
Jan 15 23:45:01.771658 kernel: pci 0000:00:04.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 15 23:45:01.771715 kernel: pci 0000:00:04.1: BAR 0 [mem 0x12587000-0x12587fff]
Jan 15 23:45:01.771772 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a]
Jan 15 23:45:01.771829 kernel: pci 0000:00:04.1: bridge window [io 0x7000-0x7fff]
Jan 15 23:45:01.771885 kernel: pci 0000:00:04.1: bridge window [mem 0x10e00000-0x10ffffff]
Jan 15 23:45:01.771950 kernel: pci 0000:00:04.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 15 23:45:01.772010 kernel: pci 0000:00:04.2: BAR 0 [mem 0x12586000-0x12586fff]
Jan 15 23:45:01.772067 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b]
Jan 15 23:45:01.772124 kernel: pci 0000:00:04.2: bridge window [io 0x6000-0x6fff]
Jan 15 23:45:01.772180 kernel: pci 0000:00:04.2: bridge window [mem 0x10c00000-0x10dfffff]
Jan 15 23:45:01.772244 kernel: pci 0000:00:04.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 15 23:45:01.772302 kernel: pci 0000:00:04.3: BAR 0 [mem 0x12585000-0x12585fff]
Jan 15 23:45:01.772360 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c]
Jan 15 23:45:01.772418 kernel: pci 0000:00:04.3: bridge window [io 0x5000-0x5fff]
Jan 15 23:45:01.772483 kernel: pci 0000:00:04.3: bridge window [mem 0x10a00000-0x10bfffff]
Jan 15 23:45:01.772554 kernel: pci 0000:00:04.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 15 23:45:01.772612 kernel: pci 0000:00:04.4: BAR 0 [mem 0x12584000-0x12584fff]
Jan 15 23:45:01.772672 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d]
Jan 15 23:45:01.772730 kernel: pci 0000:00:04.4: bridge window [io 0x4000-0x4fff]
Jan 15 23:45:01.772787 kernel: pci 0000:00:04.4: bridge window [mem 0x10800000-0x109fffff]
Jan 15 23:45:01.772857 kernel: pci 0000:00:04.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 15 23:45:01.772928 kernel: pci 0000:00:04.5: BAR 0 [mem 0x12583000-0x12583fff]
Jan 15 23:45:01.772990 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e]
Jan 15 23:45:01.773047 kernel: pci 0000:00:04.5: bridge window [io 0x3000-0x3fff]
Jan 15 23:45:01.773104 kernel: pci 0000:00:04.5: bridge window [mem 0x10600000-0x107fffff]
Jan 15 23:45:01.773170 kernel: pci 0000:00:04.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 15 23:45:01.773228 kernel: pci 0000:00:04.6: BAR 0 [mem 0x12582000-0x12582fff]
Jan 15 23:45:01.773285 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f]
Jan 15 23:45:01.773342 kernel: pci 0000:00:04.6: bridge window [io 0x2000-0x2fff]
Jan 15 23:45:01.773402 kernel: pci 0000:00:04.6: bridge window [mem 0x10400000-0x105fffff]
Jan 15 23:45:01.773474 kernel: pci 0000:00:04.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 15 23:45:01.773534 kernel: pci 0000:00:04.7: BAR 0 [mem 0x12581000-0x12581fff]
Jan 15 23:45:01.773595 kernel: pci 0000:00:04.7: PCI bridge to [bus 20]
Jan 15 23:45:01.773652 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x1fff]
Jan 15 23:45:01.773709 kernel: pci 0000:00:04.7: bridge window [mem 0x10200000-0x103fffff]
Jan 15 23:45:01.773772 kernel: pci 0000:00:05.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 15 23:45:01.773831 kernel: pci 0000:00:05.0: BAR 0 [mem 0x12580000-0x12580fff]
Jan 15 23:45:01.773888 kernel: pci 0000:00:05.0: PCI bridge to [bus 21]
Jan 15 23:45:01.773945 kernel: pci 0000:00:05.0: bridge window [io 0x0000-0x0fff]
Jan 15 23:45:01.774014 kernel: pci 0000:00:05.0: bridge window [mem 0x10000000-0x101fffff]
Jan 15 23:45:01.774087 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Jan 15 23:45:01.774148 kernel: pci 0000:01:00.0: BAR 1 [mem 0x12400000-0x12400fff]
Jan 15 23:45:01.774209 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]
Jan 15 23:45:01.774268 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Jan 15 23:45:01.774335 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Jan 15 23:45:01.774394 kernel: pci 0000:02:00.0: BAR 0 [mem 0x12300000-0x12303fff 64bit]
Jan 15 23:45:01.774480 kernel: pci 0000:03:00.0: [1af4:1042] type 00 class 0x010000 PCIe Endpoint
Jan 15 23:45:01.774544 kernel: pci 0000:03:00.0: BAR 1 [mem 0x12200000-0x12200fff]
Jan 15 23:45:01.774606 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000100000-0x8000103fff 64bit pref]
Jan 15 23:45:01.774683 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Jan 15 23:45:01.774748 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000200000-0x8000203fff 64bit pref]
Jan 15 23:45:01.774819 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Jan 15 23:45:01.774888 kernel: pci 0000:05:00.0: BAR 1 [mem 0x12100000-0x12100fff]
Jan 15 23:45:01.774951 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000300000-0x8000303fff 64bit pref]
Jan 15 23:45:01.775031 kernel: pci 0000:06:00.0: [1af4:1050] type 00 class 0x038000 PCIe Endpoint
Jan 15 23:45:01.775095 kernel: pci 0000:06:00.0: BAR 1 [mem 0x12000000-0x12000fff]
Jan 15 23:45:01.775158 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]
Jan 15 23:45:01.775218 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
Jan 15 23:45:01.775282 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000
Jan 15 23:45:01.775340 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000
Jan 15 23:45:01.775402 kernel: pci 0000:00:01.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
Jan 15 23:45:01.775470 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
Jan 15 23:45:01.775529 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000
Jan 15 23:45:01.775592 kernel: pci 0000:00:01.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Jan 15 23:45:01.775652 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000
Jan 15 23:45:01.775709 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000
Jan 15 23:45:01.775770 kernel: pci 0000:00:01.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Jan 15 23:45:01.775829 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000
Jan 15 23:45:01.775887 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
Jan 15 23:45:01.775948 kernel: pci 0000:00:01.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Jan 15 23:45:01.776008 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000
Jan 15 23:45:01.776073 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000
Jan 15 23:45:01.776135 kernel: pci 0000:00:01.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Jan 15 23:45:01.776194 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000
Jan 15 23:45:01.776252 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000
Jan 15 23:45:01.776313 kernel: pci 0000:00:01.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Jan 15 23:45:01.776371 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 07] add_size 200000 add_align 100000
Jan 15 23:45:01.776442 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff] to [bus 07] add_size 200000 add_align 100000
Jan 15 23:45:01.776517 kernel: pci 0000:00:01.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Jan 15 23:45:01.776578 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000
Jan 15 23:45:01.776636 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000
Jan 15 23:45:01.776698 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Jan 15 23:45:01.776756 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000
Jan 15 23:45:01.776814 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000
Jan 15 23:45:01.776878 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000
Jan 15 23:45:01.776936 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0a] add_size 200000 add_align 100000
Jan 15 23:45:01.776994 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff] to [bus 0a] add_size 200000 add_align 100000
Jan 15 23:45:01.777055 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 0b] add_size 1000
Jan 15 23:45:01.777113 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000
Jan 15 23:45:01.777171 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff] to [bus 0b] add_size 200000 add_align 100000
Jan 15 23:45:01.777237 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 0c] add_size 1000
Jan 15 23:45:01.777298 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0c] add_size 200000 add_align 100000
Jan 15 23:45:01.777359 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 0c] add_size 200000 add_align 100000
Jan 15 23:45:01.777420 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 0d] add_size 1000
Jan 15 23:45:01.777488 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0d] add_size 200000 add_align 100000
Jan 15 23:45:01.777547 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff] to [bus 0d] add_size 200000 add_align 100000
Jan 15 23:45:01.777608 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000
Jan 15 23:45:01.777668 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0e] add_size 200000 add_align 100000
Jan 15 23:45:01.777726 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff] to [bus 0e] add_size 200000 add_align 100000
Jan 15 23:45:01.777787 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000
Jan 15 23:45:01.777845 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0f] add_size 200000 add_align 100000
Jan 15 23:45:01.777902 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff] to [bus 0f] add_size 200000 add_align 100000
Jan 15 23:45:01.777964 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000
Jan 15 23:45:01.778035 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 10] add_size 200000 add_align 100000
Jan 15 23:45:01.778097 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 10] add_size 200000 add_align 100000
Jan 15 23:45:01.778159 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000
Jan 15 23:45:01.778217 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 11] add_size 200000 add_align 100000
Jan 15 23:45:01.778274 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 11] add_size 200000 add_align 100000
Jan 15 23:45:01.778337 kernel: pci 0000:00:03.1: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000
Jan 15 23:45:01.778395 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 12] add_size 200000 add_align 100000
Jan 15 23:45:01.778474 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff] to [bus 12] add_size 200000 add_align 100000
Jan 15 23:45:01.778536 kernel: pci 0000:00:03.2: bridge window [io 0x1000-0x0fff] to [bus 13] add_size 1000
Jan 15 23:45:01.778595 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 13] add_size 200000 add_align 100000
Jan 15 23:45:01.778655 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff] to [bus 13] add_size 200000 add_align 100000
Jan 15 23:45:01.778717 kernel: pci 0000:00:03.3: bridge window [io 0x1000-0x0fff] to [bus 14] add_size 1000
Jan 15 23:45:01.778776 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 14] add_size 200000 add_align 100000
Jan 15 23:45:01.778834 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff] to [bus 14] add_size 200000 add_align 100000
Jan 15 23:45:01.778897 kernel: pci 0000:00:03.4: bridge window [io 0x1000-0x0fff] to [bus 15] add_size 1000
Jan 15 23:45:01.778956 kernel: pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 15] add_size 200000 add_align 100000
Jan 15 23:45:01.779013 kernel: pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff] to [bus 15] add_size 200000 add_align 100000
Jan 15 23:45:01.779075 kernel: pci 0000:00:03.5: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000
Jan 15 23:45:01.779134 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 16] add_size 200000 add_align 100000
Jan 15 23:45:01.779192 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff] to [bus 16] add_size 200000 add_align 100000
Jan 15 23:45:01.779253 kernel: pci 0000:00:03.6: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000
Jan 15 23:45:01.779314 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 17] add_size 200000 add_align 100000
Jan 15 23:45:01.779372 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff] to [bus 17] add_size 200000 add_align 100000
Jan 15 23:45:01.779439 kernel: pci 0000:00:03.7: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000
Jan 15 23:45:01.779501 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 18] add_size 200000 add_align 100000
Jan 15 23:45:01.779559 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff] to [bus 18] add_size 200000 add_align 100000
Jan 15 23:45:01.779619 kernel: pci 0000:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000
Jan 15 23:45:01.779678 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 19] add_size 200000 add_align 100000
Jan 15 23:45:01.779738 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff] to [bus 19] add_size 200000 add_align 100000
Jan 15 23:45:01.779799 kernel: pci 0000:00:04.1: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000
Jan 15 23:45:01.779858 kernel: pci 0000:00:04.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1a] add_size 200000 add_align 100000
Jan 15 23:45:01.779917 kernel: pci 0000:00:04.1: bridge window [mem 0x00100000-0x000fffff] to [bus 1a] add_size 200000 add_align 100000
Jan 15 23:45:01.779980 kernel: pci 0000:00:04.2: bridge window [io 0x1000-0x0fff] to [bus 1b] add_size 1000
Jan 15 23:45:01.780039 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1b] add_size 200000 add_align 100000
Jan 15 23:45:01.780098 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff] to [bus 1b] add_size 200000 add_align 100000
Jan 15 23:45:01.780160 kernel: pci 0000:00:04.3: bridge window [io 0x1000-0x0fff] to [bus 1c] add_size 1000
Jan 15 23:45:01.780220 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1c] add_size 200000 add_align 100000
Jan 15 23:45:01.780278 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff] to [bus 1c] add_size 200000 add_align 100000
Jan 15 23:45:01.780340 kernel: pci 0000:00:04.4: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000
Jan 15 23:45:01.780401 kernel: pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1d] add_size 200000 add_align 100000
Jan 15 23:45:01.780475 kernel: pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff] to [bus 1d] add_size 200000 add_align 100000
Jan 15 23:45:01.780546 kernel: pci 0000:00:04.5: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000
Jan 15 23:45:01.780605 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1e] add_size 200000 add_align 100000
Jan 15 23:45:01.780665 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff] to [bus 1e] add_size 200000 add_align 100000
Jan 15 23:45:01.780726 kernel: pci 0000:00:04.6: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000
Jan 15 23:45:01.780786 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1f] add_size 200000 add_align 100000
Jan 15 23:45:01.780844 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff] to [bus 1f] add_size 200000 add_align 100000
Jan 15 23:45:01.780907 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000
Jan 15 23:45:01.780968 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 20] add_size 200000 add_align 100000
Jan 15 23:45:01.781026 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff] to [bus 20] add_size 200000 add_align 100000
Jan 15 23:45:01.781087 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000
Jan 15 23:45:01.781145 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 21] add_size 200000 add_align 100000
Jan 15 23:45:01.781204 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff] to [bus 21] add_size 200000 add_align 100000
Jan 15 23:45:01.781264 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff]: assigned
Jan 15 23:45:01.781323 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]: assigned
Jan 15 23:45:01.781384 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff]: assigned
Jan 15 23:45:01.781456 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]: assigned
Jan 15 23:45:01.781519 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff]: assigned
Jan 15 23:45:01.781577 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]: assigned
Jan 15 23:45:01.781638 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff]: assigned
Jan 15 23:45:01.781696 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]: assigned
Jan 15 23:45:01.781756 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff]: assigned
Jan 15 23:45:01.781818 kernel: pci 0000:00:01.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]: assigned
Jan 15 23:45:01.781879 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff]: assigned
Jan 15 23:45:01.781937 kernel: pci 0000:00:01.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]: assigned
Jan 15 23:45:01.782008 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff]: assigned
Jan 15 23:45:01.782071 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]: assigned
Jan 15 23:45:01.782132 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff]: assigned
Jan 15 23:45:01.782191 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]: assigned
Jan 15 23:45:01.782251 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff]: assigned
Jan 15 23:45:01.782312 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]: assigned
Jan 15 23:45:01.782372 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff]: assigned
Jan 15 23:45:01.782430 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref]: assigned
Jan 15 23:45:01.782504 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff]: assigned
Jan 15 23:45:01.782568 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref]: assigned
Jan 15 23:45:01.782629 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff]: assigned
Jan 15 23:45:01.782688 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref]: assigned
Jan 15 23:45:01.782749 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff]: assigned
Jan 15 23:45:01.782810 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref]: assigned
Jan 15 23:45:01.782870 kernel: pci 0000:00:02.5: bridge window [mem 0x11a00000-0x11bfffff]: assigned
Jan 15 23:45:01.782929 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref]: assigned
Jan 15 23:45:01.782989 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff]: assigned
Jan 15 23:45:01.783047 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref]: assigned
Jan 15 23:45:01.783109 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff]: assigned
Jan 15 23:45:01.783168 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref]: assigned
Jan 15 23:45:01.783229 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff]: assigned
Jan 15 23:45:01.783290 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref]: assigned
Jan 15 23:45:01.783351 kernel: pci 0000:00:03.1: bridge window [mem 0x12200000-0x123fffff]: assigned
Jan 15 23:45:01.783409 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref]: assigned
Jan 15 23:45:01.783485 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff]: assigned
Jan 15 23:45:01.783545 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref]: assigned
Jan 15 23:45:01.783605 kernel: pci 0000:00:03.3: bridge window [mem 0x12600000-0x127fffff]: assigned
Jan 15 23:45:01.783664 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref]: assigned
Jan 15 23:45:01.783728 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff]: assigned
Jan 15 23:45:01.783788 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref]: assigned
Jan 15 23:45:01.783850 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff]: assigned
Jan 15 23:45:01.783908 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref]: assigned
Jan 15 23:45:01.783969 kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff]: assigned
Jan 15 23:45:01.784029 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref]: assigned
Jan 15 23:45:01.784091 kernel: pci 0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff]: assigned
Jan 15 23:45:01.784151 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref]: assigned
Jan 15 23:45:01.784216 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff]: assigned
Jan 15 23:45:01.784276 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref]: assigned
Jan 15 23:45:01.784337 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff]: assigned
Jan 15 23:45:01.784395 kernel: pci 0000:00:04.1: bridge window [mem 0x8003200000-0x80033fffff 64bit pref]: assigned
Jan 15 23:45:01.784463 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff]: assigned
Jan 15 23:45:01.784523 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref]: assigned
Jan 15 23:45:01.784584 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff]: assigned
Jan 15 23:45:01.784642 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref]: assigned
Jan 15 23:45:01.784707 kernel: pci 0000:00:04.4: bridge window [mem 0x13800000-0x139fffff]: assigned
Jan 15 23:45:01.784767 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref]: assigned
Jan 15 23:45:01.784829 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff]: assigned
Jan 15 23:45:01.784888 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref]: assigned
Jan 15 23:45:01.784951 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff]: assigned
Jan 15 23:45:01.785009 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref]: assigned
Jan 15 23:45:01.785070 kernel: pci 0000:00:04.7: bridge window [mem 0x13e00000-0x13ffffff]: assigned
Jan 15 23:45:01.785128 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref]: assigned
Jan 15 23:45:01.785191 kernel: pci 0000:00:05.0: bridge window [mem 0x14000000-0x141fffff]: assigned
Jan 15 23:45:01.785250 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref]: assigned
Jan 15 23:45:01.785310 kernel: pci 0000:00:01.0: BAR 0 [mem 0x14200000-0x14200fff]: assigned
Jan 15 23:45:01.785368 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x1fff]: assigned
Jan 15 23:45:01.785430 kernel: pci 0000:00:01.1: BAR 0 [mem 0x14201000-0x14201fff]: assigned
Jan 15 23:45:01.785502 kernel: pci 0000:00:01.1: bridge window [io 0x2000-0x2fff]: assigned
Jan 15 23:45:01.785562 kernel: pci 0000:00:01.2: BAR 0 [mem 0x14202000-0x14202fff]: assigned
Jan 15 23:45:01.785620 kernel: pci 0000:00:01.2: bridge window [io 0x3000-0x3fff]: assigned
Jan 15 23:45:01.785682 kernel: pci 0000:00:01.3: BAR 0 [mem 0x14203000-0x14203fff]: assigned
Jan 15 23:45:01.785741 kernel: pci 0000:00:01.3: bridge window [io 0x4000-0x4fff]: assigned
Jan 15 23:45:01.785801 kernel: pci 0000:00:01.4: BAR 0 [mem 0x14204000-0x14204fff]: assigned
Jan 15 23:45:01.785858 kernel: pci 0000:00:01.4: bridge window [io 0x5000-0x5fff]: assigned
Jan 15 23:45:01.785918 kernel: pci 0000:00:01.5: BAR 0 [mem 0x14205000-0x14205fff]: assigned
Jan 15 23:45:01.785988 kernel: pci 0000:00:01.5: bridge window [io 0x6000-0x6fff]: assigned
Jan 15 23:45:01.786067 kernel: pci 0000:00:01.6: BAR 0 [mem 0x14206000-0x14206fff]: assigned
Jan 15 23:45:01.786128 kernel: pci 0000:00:01.6: bridge window [io 0x7000-0x7fff]: assigned
Jan 15 23:45:01.786191 kernel: pci 0000:00:01.7: BAR 0 [mem 0x14207000-0x14207fff]: assigned
Jan 15 23:45:01.786253 kernel: pci 0000:00:01.7: bridge window [io 0x8000-0x8fff]: assigned
Jan 15 23:45:01.786313 kernel: pci 0000:00:02.0: BAR 0 [mem 0x14208000-0x14208fff]: assigned
Jan 15 23:45:01.786374 kernel: pci 0000:00:02.0: bridge window [io 0x9000-0x9fff]: assigned
Jan 15 23:45:01.786448 kernel: pci 0000:00:02.1: BAR 0 [mem 0x14209000-0x14209fff]: assigned
Jan 15 23:45:01.786511 kernel: pci 0000:00:02.1: bridge window [io 0xa000-0xafff]: assigned
Jan 15 23:45:01.786571 kernel: pci 0000:00:02.2: BAR 0 [mem 0x1420a000-0x1420afff]: assigned
Jan 15 23:45:01.786629 kernel: pci 0000:00:02.2: bridge window [io 0xb000-0xbfff]: assigned
Jan 15 23:45:01.786692 kernel: pci 0000:00:02.3: BAR 0 [mem 0x1420b000-0x1420bfff]: assigned
Jan 15 23:45:01.786751 kernel: pci 0000:00:02.3: bridge window [io 0xc000-0xcfff]: assigned
Jan 15 23:45:01.786810 kernel: pci 0000:00:02.4: BAR 0 [mem 0x1420c000-0x1420cfff]: assigned
Jan 15 23:45:01.786877 kernel: pci 0000:00:02.4: bridge window [io 0xd000-0xdfff]: assigned
Jan 15 23:45:01.786940 kernel: pci 0000:00:02.5: BAR 0 [mem 0x1420d000-0x1420dfff]: assigned
Jan 15 23:45:01.786999 kernel: pci 0000:00:02.5: bridge window [io 0xe000-0xefff]: assigned
Jan 15 23:45:01.787059 kernel: pci 0000:00:02.6: BAR 0 [mem 0x1420e000-0x1420efff]: assigned
Jan 15 23:45:01.787120 kernel: pci 0000:00:02.6: bridge window [io 0xf000-0xffff]: assigned
Jan 15 23:45:01.787180 kernel: pci 0000:00:02.7: BAR 0 [mem 0x1420f000-0x1420ffff]: assigned
Jan 15 23:45:01.787238 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space
Jan 15 23:45:01.787296 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign
Jan 15 23:45:01.787355 kernel: pci 0000:00:03.0: BAR 0 [mem 0x14210000-0x14210fff]: assigned
Jan 15 23:45:01.787414 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space
Jan 15 23:45:01.787496 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign
Jan 15 23:45:01.787560 kernel: pci 0000:00:03.1: BAR 0 [mem 0x14211000-0x14211fff]: assigned
Jan 15 23:45:01.787618 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space
Jan 15 23:45:01.787677 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign
Jan 15 23:45:01.787737 kernel: pci 0000:00:03.2: BAR 0 [mem 0x14212000-0x14212fff]: assigned
Jan 15 23:45:01.787797 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: can't assign; no space
Jan 15 23:45:01.787857 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: failed to assign
Jan 15 23:45:01.787921 kernel: pci 0000:00:03.3: BAR 0 [mem 0x14213000-0x14213fff]: assigned
Jan 15 23:45:01.787981 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: can't assign; no space
Jan 15 23:45:01.788039 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: failed to assign
Jan 15 23:45:01.788099 kernel: pci 0000:00:03.4: BAR 0 [mem 0x14214000-0x14214fff]: assigned
Jan 15 23:45:01.788157 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: can't assign; no space
Jan 15 23:45:01.788215 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: failed to assign
Jan 15 23:45:01.788275 kernel: pci 0000:00:03.5: BAR 0 [mem 0x14215000-0x14215fff]: assigned
Jan 15 23:45:01.788332 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: can't assign; no space
Jan 15 23:45:01.788393 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: failed to assign
Jan 15 23:45:01.788467 kernel: pci 0000:00:03.6: BAR 0 [mem 0x14216000-0x14216fff]: assigned
Jan 15 23:45:01.788527 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space
Jan 15 23:45:01.788585 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign
Jan 15 23:45:01.788645 kernel: pci 0000:00:03.7: BAR 0 [mem 0x14217000-0x14217fff]: assigned
Jan 15 23:45:01.788703 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space
Jan 15 23:45:01.788760 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign
Jan 15 23:45:01.788819 kernel: pci 0000:00:04.0: BAR 0 [mem 0x14218000-0x14218fff]: assigned
Jan 15 23:45:01.788880 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: can't assign; no space
Jan 15 23:45:01.788939 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: failed to assign
Jan 15 23:45:01.788999 kernel: pci 0000:00:04.1: BAR 0 [mem 0x14219000-0x14219fff]: assigned
Jan 15 23:45:01.789057 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: can't assign; no space
Jan 15 23:45:01.789115 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: failed to assign
Jan 15 23:45:01.789175 kernel: pci 0000:00:04.2: BAR 0 [mem 0x1421a000-0x1421afff]: assigned
Jan 15 23:45:01.789233 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: can't assign; no space
Jan 15 23:45:01.789290 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: failed to assign
Jan 15 23:45:01.789353 kernel: pci 0000:00:04.3: BAR 0 [mem 0x1421b000-0x1421bfff]: assigned
Jan 15 23:45:01.789411 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: can't assign; no space
Jan 15 23:45:01.789477 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: failed to assign
Jan 15 23:45:01.789537 kernel: pci 0000:00:04.4: BAR 0 [mem 0x1421c000-0x1421cfff]: assigned
Jan 15 23:45:01.789595 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: can't assign; no space
Jan 15 23:45:01.789653 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: failed to assign
Jan 15 23:45:01.789714 kernel: pci 0000:00:04.5: BAR 0 [mem 0x1421d000-0x1421dfff]: assigned
Jan 15 23:45:01.789773 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: can't assign; no space
Jan 15 23:45:01.789834 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: failed to assign
Jan 15 23:45:01.789895 kernel: pci 0000:00:04.6: BAR 0 [mem 0x1421e000-0x1421efff]: assigned
Jan 15 23:45:01.789955 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: can't assign; no space
Jan 15 23:45:01.790025 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: failed to assign
Jan 15 23:45:01.790087 kernel: pci 0000:00:04.7: BAR 0 [mem 0x1421f000-0x1421ffff]: assigned
Jan 15 23:45:01.790146 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: can't assign; no space
Jan 15 23:45:01.790204 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: failed to assign
Jan 15 23:45:01.790265 kernel: pci 0000:00:05.0: BAR 0 [mem 0x14220000-0x14220fff]: assigned
Jan 15 23:45:01.790326 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: can't assign; no space
Jan 15 23:45:01.790385 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: failed to assign
Jan 15 23:45:01.790956 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff]: assigned
Jan 15 23:45:01.791055 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff]: assigned
Jan 15 23:45:01.791118 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff]: assigned
Jan 15 23:45:01.791185 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff]: assigned
Jan 15 23:45:01.791246 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff]: assigned
Jan 15 23:45:01.791308 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff]: assigned
Jan 15 23:45:01.791369 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff]: assigned
Jan 15 23:45:01.791429 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff]: assigned
Jan 15 23:45:01.791518 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff]: assigned
Jan 15 23:45:01.791581 kernel: pci 0000:00:03.7: bridge window [io 0xa000-0xafff]: assigned
Jan 15 23:45:01.791642 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff]: assigned
Jan 15 23:45:01.791702 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff]: assigned
Jan 15 23:45:01.791767 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff]: assigned
Jan 15 23:45:01.791829 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff]: assigned
Jan 15 23:45:01.791890 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff]: assigned
Jan 15 23:45:01.791953 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space
Jan 15 23:45:01.792013 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign
Jan 15 23:45:01.792076 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space
Jan 15 23:45:01.792135 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign
Jan 15 23:45:01.792196 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space
Jan 15 23:45:01.792254 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign
Jan 15 23:45:01.792314 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: can't assign; no space
Jan 15 23:45:01.792372 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: failed to assign
Jan 15 23:45:01.792431 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: can't assign; no space
Jan 15 23:45:01.792512 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: failed to assign
Jan 15 23:45:01.792574 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: can't assign; no space
Jan 15 23:45:01.792633 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: failed to assign
Jan 15 23:45:01.792693 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: can't assign; no space
Jan 15 23:45:01.792751 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: failed to assign
Jan 15 23:45:01.792812 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: can't assign; no space
Jan 15 23:45:01.792871 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: failed to assign
Jan 15 23:45:01.792931 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: can't assign; no space
Jan 15 23:45:01.792991 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: failed to assign
Jan 15 23:45:01.793051 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: can't assign; no space
Jan 15 23:45:01.793109 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: failed to assign
Jan 15 23:45:01.793171 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: can't assign; no space
Jan 15 23:45:01.793230 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: failed to assign
Jan 15 23:45:01.793290 kernel: pci 0000:00:01.6: bridge window [io size 0x1000]: can't assign; no space
Jan 15 23:45:01.793349 kernel: pci 0000:00:01.6: bridge window [io size 0x1000]: failed to assign
Jan 15 23:45:01.793420 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: can't assign; no space
Jan 15 23:45:01.793507 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: failed to assign
Jan 15 23:45:01.793570 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: can't assign; no space
Jan 15 23:45:01.793630 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: failed to assign
Jan 15 23:45:01.793692 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: can't assign; no space
Jan 15 23:45:01.793751 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: failed to assign
Jan 15 23:45:01.793811 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: can't assign; no space
Jan 15 23:45:01.793875 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: failed to assign
Jan 15 23:45:01.793936 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: can't assign; no space
Jan 15 23:45:01.794007 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: failed to assign
Jan 15 23:45:01.794079 kernel: pci 0000:00:01.0: bridge window [io size 0x1000]: can't assign; no space
Jan 15 23:45:01.794154 kernel: pci 0000:00:01.0: bridge window [io size 0x1000]: failed to assign
Jan 15 23:45:01.794224 kernel: pci 0000:01:00.0: ROM [mem 0x10000000-0x1007ffff pref]: assigned
Jan 15 23:45:01.794288 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned
Jan 15 23:45:01.794357 kernel: pci 0000:01:00.0: BAR 1 [mem 0x10080000-0x10080fff]: assigned
Jan 15 23:45:01.794418 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Jan 15 23:45:01.794492 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff]
Jan 15 23:45:01.794552 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]
Jan 15 23:45:01.794616 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10200000-0x10203fff 64bit]: assigned
Jan 15 23:45:01.794676 kernel: pci 0000:00:01.1: PCI bridge to [bus 02]
Jan 15 23:45:01.794740 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff]
Jan 15 23:45:01.794800 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]
Jan 15 23:45:01.794866 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]: assigned
Jan 15 23:45:01.794927 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10400000-0x10400fff]: assigned
Jan 15 23:45:01.794986 kernel: pci 0000:00:01.2: PCI bridge to [bus 03]
Jan 15 23:45:01.795044 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff]
Jan 15 23:45:01.795103 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]
Jan 15 23:45:01.795169 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]: assigned
Jan 15 23:45:01.795231 kernel: pci 0000:00:01.3: PCI bridge to [bus 04]
Jan 15 23:45:01.795289 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff]
Jan 15 23:45:01.795347 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]
Jan 15 23:45:01.795412 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000800000-0x8000803fff 64bit pref]: assigned
Jan 15 23:45:01.795517 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff]: assigned
Jan 15 23:45:01.795580 kernel: pci 0000:00:01.4: PCI bridge to [bus 05]
Jan 15 23:45:01.795639 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff]
Jan 15 23:45:01.795700 kernel: pci 0000:00:01.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]
Jan 15 23:45:01.795766
kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000a00000-0x8000a03fff 64bit pref]: assigned Jan 15 23:45:01.795827 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10a00000-0x10a00fff]: assigned Jan 15 23:45:01.795885 kernel: pci 0000:00:01.5: PCI bridge to [bus 06] Jan 15 23:45:01.795944 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff] Jan 15 23:45:01.796002 kernel: pci 0000:00:01.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref] Jan 15 23:45:01.796061 kernel: pci 0000:00:01.6: PCI bridge to [bus 07] Jan 15 23:45:01.796119 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff] Jan 15 23:45:01.796179 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref] Jan 15 23:45:01.796238 kernel: pci 0000:00:01.7: PCI bridge to [bus 08] Jan 15 23:45:01.796297 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff] Jan 15 23:45:01.796355 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref] Jan 15 23:45:01.796415 kernel: pci 0000:00:02.0: PCI bridge to [bus 09] Jan 15 23:45:01.796484 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff] Jan 15 23:45:01.796546 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref] Jan 15 23:45:01.796607 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a] Jan 15 23:45:01.796666 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff] Jan 15 23:45:01.796724 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref] Jan 15 23:45:01.796784 kernel: pci 0000:00:02.2: PCI bridge to [bus 0b] Jan 15 23:45:01.796842 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff] Jan 15 23:45:01.796902 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref] Jan 15 23:45:01.796964 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c] Jan 15 23:45:01.797025 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff] Jan 15 23:45:01.797084 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref] Jan 15 23:45:01.797143 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d] Jan 15 23:45:01.797202 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff] Jan 15 23:45:01.797261 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref] Jan 15 23:45:01.797324 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e] Jan 15 23:45:01.797383 kernel: pci 0000:00:02.5: bridge window [mem 0x11a00000-0x11bfffff] Jan 15 23:45:01.797449 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref] Jan 15 23:45:01.797510 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f] Jan 15 23:45:01.797570 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff] Jan 15 23:45:01.797629 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref] Jan 15 23:45:01.797692 kernel: pci 0000:00:02.7: PCI bridge to [bus 10] Jan 15 23:45:01.797752 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff] Jan 15 23:45:01.797811 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref] Jan 15 23:45:01.797871 kernel: pci 0000:00:03.0: PCI bridge to [bus 11] Jan 15 23:45:01.797929 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff] Jan 15 23:45:01.797986 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref] Jan 15 23:45:01.798059 kernel: pci 0000:00:03.1: PCI bridge to [bus 12] Jan 15 23:45:01.798122 kernel: pci 0000:00:03.1: bridge window [mem 
0x12200000-0x123fffff] Jan 15 23:45:01.798180 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref] Jan 15 23:45:01.798241 kernel: pci 0000:00:03.2: PCI bridge to [bus 13] Jan 15 23:45:01.798301 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff] Jan 15 23:45:01.798360 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff] Jan 15 23:45:01.798417 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref] Jan 15 23:45:01.798492 kernel: pci 0000:00:03.3: PCI bridge to [bus 14] Jan 15 23:45:01.798552 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff] Jan 15 23:45:01.798613 kernel: pci 0000:00:03.3: bridge window [mem 0x12600000-0x127fffff] Jan 15 23:45:01.798671 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref] Jan 15 23:45:01.798730 kernel: pci 0000:00:03.4: PCI bridge to [bus 15] Jan 15 23:45:01.798788 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff] Jan 15 23:45:01.798846 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff] Jan 15 23:45:01.798904 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref] Jan 15 23:45:01.798970 kernel: pci 0000:00:03.5: PCI bridge to [bus 16] Jan 15 23:45:01.799029 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff] Jan 15 23:45:01.799089 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff] Jan 15 23:45:01.799147 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref] Jan 15 23:45:01.799207 kernel: pci 0000:00:03.6: PCI bridge to [bus 17] Jan 15 23:45:01.799265 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff] Jan 15 23:45:01.799324 kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff] Jan 15 23:45:01.799381 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref] Jan 15 23:45:01.799452 kernel: pci 0000:00:03.7: PCI bridge to [bus 18] Jan 15 23:45:01.799512 kernel: pci 0000:00:03.7: bridge window [io 0xa000-0xafff] Jan 15 23:45:01.799571 kernel: pci 0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff] Jan 15 23:45:01.799631 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref] Jan 15 23:45:01.799691 kernel: pci 0000:00:04.0: PCI bridge to [bus 19] Jan 15 23:45:01.799749 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff] Jan 15 23:45:01.799807 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff] Jan 15 23:45:01.799867 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref] Jan 15 23:45:01.799928 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a] Jan 15 23:45:01.799986 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff] Jan 15 23:45:01.800045 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff] Jan 15 23:45:01.800104 kernel: pci 0000:00:04.1: bridge window [mem 0x8003200000-0x80033fffff 64bit pref] Jan 15 23:45:01.800164 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b] Jan 15 23:45:01.800223 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff] Jan 15 23:45:01.800281 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff] Jan 15 23:45:01.800338 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref] Jan 15 23:45:01.800398 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c] Jan 15 23:45:01.800470 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff] Jan 15 23:45:01.800529 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff] Jan 15 
23:45:01.800590 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref] Jan 15 23:45:01.800651 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d] Jan 15 23:45:01.800709 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff] Jan 15 23:45:01.800767 kernel: pci 0000:00:04.4: bridge window [mem 0x13800000-0x139fffff] Jan 15 23:45:01.800824 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref] Jan 15 23:45:01.800884 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e] Jan 15 23:45:01.800942 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff] Jan 15 23:45:01.801002 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff] Jan 15 23:45:01.801062 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref] Jan 15 23:45:01.801125 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f] Jan 15 23:45:01.801183 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff] Jan 15 23:45:01.801241 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff] Jan 15 23:45:01.801299 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref] Jan 15 23:45:01.801359 kernel: pci 0000:00:04.7: PCI bridge to [bus 20] Jan 15 23:45:01.801418 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff] Jan 15 23:45:01.801486 kernel: pci 0000:00:04.7: bridge window [mem 0x13e00000-0x13ffffff] Jan 15 23:45:01.801544 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref] Jan 15 23:45:01.801606 kernel: pci 0000:00:05.0: PCI bridge to [bus 21] Jan 15 23:45:01.801664 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff] Jan 15 23:45:01.801722 kernel: pci 0000:00:05.0: bridge window [mem 0x14000000-0x141fffff] Jan 15 23:45:01.801780 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref] Jan 15 23:45:01.801841 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] Jan 15 23:45:01.801894 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] Jan 15 23:45:01.801945 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window] Jan 15 23:45:01.802024 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff] Jan 15 23:45:01.802082 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref] Jan 15 23:45:01.802143 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff] Jan 15 23:45:01.802198 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref] Jan 15 23:45:01.802259 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff] Jan 15 23:45:01.802314 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref] Jan 15 23:45:01.802380 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff] Jan 15 23:45:01.802447 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref] Jan 15 23:45:01.802511 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff] Jan 15 23:45:01.802566 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref] Jan 15 23:45:01.802628 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff] Jan 15 23:45:01.802682 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref] Jan 15 23:45:01.802749 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff] Jan 15 23:45:01.802806 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref] Jan 15 23:45:01.802868 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] Jan 15 
23:45:01.802922 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] Jan 15 23:45:01.802983 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] Jan 15 23:45:01.803036 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] Jan 15 23:45:01.803097 kernel: pci_bus 0000:0a: resource 1 [mem 0x11200000-0x113fffff] Jan 15 23:45:01.803153 kernel: pci_bus 0000:0a: resource 2 [mem 0x8001200000-0x80013fffff 64bit pref] Jan 15 23:45:01.803218 kernel: pci_bus 0000:0b: resource 1 [mem 0x11400000-0x115fffff] Jan 15 23:45:01.803273 kernel: pci_bus 0000:0b: resource 2 [mem 0x8001400000-0x80015fffff 64bit pref] Jan 15 23:45:01.803333 kernel: pci_bus 0000:0c: resource 1 [mem 0x11600000-0x117fffff] Jan 15 23:45:01.803387 kernel: pci_bus 0000:0c: resource 2 [mem 0x8001600000-0x80017fffff 64bit pref] Jan 15 23:45:01.803460 kernel: pci_bus 0000:0d: resource 1 [mem 0x11800000-0x119fffff] Jan 15 23:45:01.803518 kernel: pci_bus 0000:0d: resource 2 [mem 0x8001800000-0x80019fffff 64bit pref] Jan 15 23:45:01.803581 kernel: pci_bus 0000:0e: resource 1 [mem 0x11a00000-0x11bfffff] Jan 15 23:45:01.803635 kernel: pci_bus 0000:0e: resource 2 [mem 0x8001a00000-0x8001bfffff 64bit pref] Jan 15 23:45:01.803695 kernel: pci_bus 0000:0f: resource 1 [mem 0x11c00000-0x11dfffff] Jan 15 23:45:01.803749 kernel: pci_bus 0000:0f: resource 2 [mem 0x8001c00000-0x8001dfffff 64bit pref] Jan 15 23:45:01.803814 kernel: pci_bus 0000:10: resource 1 [mem 0x11e00000-0x11ffffff] Jan 15 23:45:01.803870 kernel: pci_bus 0000:10: resource 2 [mem 0x8001e00000-0x8001ffffff 64bit pref] Jan 15 23:45:01.803932 kernel: pci_bus 0000:11: resource 1 [mem 0x12000000-0x121fffff] Jan 15 23:45:01.803986 kernel: pci_bus 0000:11: resource 2 [mem 0x8002000000-0x80021fffff 64bit pref] Jan 15 23:45:01.804046 kernel: pci_bus 0000:12: resource 1 [mem 0x12200000-0x123fffff] Jan 15 23:45:01.804100 kernel: pci_bus 0000:12: resource 2 [mem 0x8002200000-0x80023fffff 64bit pref] Jan 15 23:45:01.804162 kernel: pci_bus 0000:13: resource 0 [io 0xf000-0xffff] Jan 15 23:45:01.804216 kernel: pci_bus 0000:13: resource 1 [mem 0x12400000-0x125fffff] Jan 15 23:45:01.804269 kernel: pci_bus 0000:13: resource 2 [mem 0x8002400000-0x80025fffff 64bit pref] Jan 15 23:45:01.804329 kernel: pci_bus 0000:14: resource 0 [io 0xe000-0xefff] Jan 15 23:45:01.804384 kernel: pci_bus 0000:14: resource 1 [mem 0x12600000-0x127fffff] Jan 15 23:45:01.804454 kernel: pci_bus 0000:14: resource 2 [mem 0x8002600000-0x80027fffff 64bit pref] Jan 15 23:45:01.804520 kernel: pci_bus 0000:15: resource 0 [io 0xd000-0xdfff] Jan 15 23:45:01.804578 kernel: pci_bus 0000:15: resource 1 [mem 0x12800000-0x129fffff] Jan 15 23:45:01.804633 kernel: pci_bus 0000:15: resource 2 [mem 0x8002800000-0x80029fffff 64bit pref] Jan 15 23:45:01.804694 kernel: pci_bus 0000:16: resource 0 [io 0xc000-0xcfff] Jan 15 23:45:01.804749 kernel: pci_bus 0000:16: resource 1 [mem 0x12a00000-0x12bfffff] Jan 15 23:45:01.804802 kernel: pci_bus 0000:16: resource 2 [mem 0x8002a00000-0x8002bfffff 64bit pref] Jan 15 23:45:01.804864 kernel: pci_bus 0000:17: resource 0 [io 0xb000-0xbfff] Jan 15 23:45:01.804919 kernel: pci_bus 0000:17: resource 1 [mem 0x12c00000-0x12dfffff] Jan 15 23:45:01.804976 kernel: pci_bus 0000:17: resource 2 [mem 0x8002c00000-0x8002dfffff 64bit pref] Jan 15 23:45:01.805045 kernel: pci_bus 0000:18: resource 0 [io 0xa000-0xafff] Jan 15 23:45:01.805100 kernel: pci_bus 0000:18: resource 1 [mem 0x12e00000-0x12ffffff] Jan 15 23:45:01.805155 kernel: pci_bus 0000:18: resource 2 [mem 
0x8002e00000-0x8002ffffff 64bit pref] Jan 15 23:45:01.805215 kernel: pci_bus 0000:19: resource 0 [io 0x9000-0x9fff] Jan 15 23:45:01.805270 kernel: pci_bus 0000:19: resource 1 [mem 0x13000000-0x131fffff] Jan 15 23:45:01.805323 kernel: pci_bus 0000:19: resource 2 [mem 0x8003000000-0x80031fffff 64bit pref] Jan 15 23:45:01.805386 kernel: pci_bus 0000:1a: resource 0 [io 0x8000-0x8fff] Jan 15 23:45:01.805455 kernel: pci_bus 0000:1a: resource 1 [mem 0x13200000-0x133fffff] Jan 15 23:45:01.805512 kernel: pci_bus 0000:1a: resource 2 [mem 0x8003200000-0x80033fffff 64bit pref] Jan 15 23:45:01.805574 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Jan 15 23:45:01.805628 kernel: pci_bus 0000:1b: resource 1 [mem 0x13400000-0x135fffff] Jan 15 23:45:01.805682 kernel: pci_bus 0000:1b: resource 2 [mem 0x8003400000-0x80035fffff 64bit pref] Jan 15 23:45:01.805744 kernel: pci_bus 0000:1c: resource 0 [io 0x6000-0x6fff] Jan 15 23:45:01.805799 kernel: pci_bus 0000:1c: resource 1 [mem 0x13600000-0x137fffff] Jan 15 23:45:01.805853 kernel: pci_bus 0000:1c: resource 2 [mem 0x8003600000-0x80037fffff 64bit pref] Jan 15 23:45:01.805913 kernel: pci_bus 0000:1d: resource 0 [io 0x5000-0x5fff] Jan 15 23:45:01.805967 kernel: pci_bus 0000:1d: resource 1 [mem 0x13800000-0x139fffff] Jan 15 23:45:01.806045 kernel: pci_bus 0000:1d: resource 2 [mem 0x8003800000-0x80039fffff 64bit pref] Jan 15 23:45:01.806109 kernel: pci_bus 0000:1e: resource 0 [io 0x4000-0x4fff] Jan 15 23:45:01.806166 kernel: pci_bus 0000:1e: resource 1 [mem 0x13a00000-0x13bfffff] Jan 15 23:45:01.806220 kernel: pci_bus 0000:1e: resource 2 [mem 0x8003a00000-0x8003bfffff 64bit pref] Jan 15 23:45:01.806280 kernel: pci_bus 0000:1f: resource 0 [io 0x3000-0x3fff] Jan 15 23:45:01.806334 kernel: pci_bus 0000:1f: resource 1 [mem 0x13c00000-0x13dfffff] Jan 15 23:45:01.806387 kernel: pci_bus 0000:1f: resource 2 [mem 0x8003c00000-0x8003dfffff 64bit pref] Jan 15 23:45:01.806463 kernel: pci_bus 0000:20: resource 0 [io 0x2000-0x2fff] Jan 15 23:45:01.806520 kernel: pci_bus 0000:20: resource 1 [mem 0x13e00000-0x13ffffff] Jan 15 23:45:01.806577 kernel: pci_bus 0000:20: resource 2 [mem 0x8003e00000-0x8003ffffff 64bit pref] Jan 15 23:45:01.806642 kernel: pci_bus 0000:21: resource 0 [io 0x1000-0x1fff] Jan 15 23:45:01.806696 kernel: pci_bus 0000:21: resource 1 [mem 0x14000000-0x141fffff] Jan 15 23:45:01.806750 kernel: pci_bus 0000:21: resource 2 [mem 0x8004000000-0x80041fffff 64bit pref] Jan 15 23:45:01.806759 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Jan 15 23:45:01.806767 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Jan 15 23:45:01.806775 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Jan 15 23:45:01.806784 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Jan 15 23:45:01.806791 kernel: iommu: Default domain type: Translated Jan 15 23:45:01.806799 kernel: iommu: DMA domain TLB invalidation policy: strict mode Jan 15 23:45:01.806806 kernel: efivars: Registered efivars operations Jan 15 23:45:01.806813 kernel: vgaarb: loaded Jan 15 23:45:01.806820 kernel: clocksource: Switched to clocksource arch_sys_counter Jan 15 23:45:01.806828 kernel: VFS: Disk quotas dquot_6.6.0 Jan 15 23:45:01.806835 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jan 15 23:45:01.806843 kernel: pnp: PnP ACPI init Jan 15 23:45:01.806908 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Jan 15 23:45:01.806920 kernel: pnp: PnP ACPI: found 1 devices Jan 15 23:45:01.806927 kernel: NET: Registered 
PF_INET protocol family Jan 15 23:45:01.806935 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear) Jan 15 23:45:01.806943 kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear) Jan 15 23:45:01.806950 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jan 15 23:45:01.806958 kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear) Jan 15 23:45:01.806965 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear) Jan 15 23:45:01.806973 kernel: TCP: Hash tables configured (established 131072 bind 65536) Jan 15 23:45:01.806982 kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear) Jan 15 23:45:01.806989 kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear) Jan 15 23:45:01.806997 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jan 15 23:45:01.807061 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Jan 15 23:45:01.807071 kernel: PCI: CLS 0 bytes, default 64 Jan 15 23:45:01.807079 kernel: kvm [1]: HYP mode not available Jan 15 23:45:01.807086 kernel: Initialise system trusted keyrings Jan 15 23:45:01.807094 kernel: workingset: timestamp_bits=39 max_order=22 bucket_order=0 Jan 15 23:45:01.807101 kernel: Key type asymmetric registered Jan 15 23:45:01.807110 kernel: Asymmetric key parser 'x509' registered Jan 15 23:45:01.807117 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Jan 15 23:45:01.807125 kernel: io scheduler mq-deadline registered Jan 15 23:45:01.807132 kernel: io scheduler kyber registered Jan 15 23:45:01.807140 kernel: io scheduler bfq registered Jan 15 23:45:01.807148 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Jan 15 23:45:01.807207 kernel: pcieport 0000:00:01.0: PME: Signaling with IRQ 50 Jan 15 23:45:01.807267 kernel: pcieport 0000:00:01.0: AER: enabled with IRQ 50 Jan 15 23:45:01.807325 kernel: pcieport 0000:00:01.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 23:45:01.807388 kernel: pcieport 0000:00:01.1: PME: Signaling with IRQ 51 Jan 15 23:45:01.807456 kernel: pcieport 0000:00:01.1: AER: enabled with IRQ 51 Jan 15 23:45:01.807517 kernel: pcieport 0000:00:01.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 23:45:01.807579 kernel: pcieport 0000:00:01.2: PME: Signaling with IRQ 52 Jan 15 23:45:01.807638 kernel: pcieport 0000:00:01.2: AER: enabled with IRQ 52 Jan 15 23:45:01.807696 kernel: pcieport 0000:00:01.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 23:45:01.807757 kernel: pcieport 0000:00:01.3: PME: Signaling with IRQ 53 Jan 15 23:45:01.807818 kernel: pcieport 0000:00:01.3: AER: enabled with IRQ 53 Jan 15 23:45:01.807876 kernel: pcieport 0000:00:01.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 23:45:01.807936 kernel: pcieport 0000:00:01.4: PME: Signaling with IRQ 54 Jan 15 23:45:01.807995 kernel: pcieport 0000:00:01.4: AER: enabled with IRQ 54 Jan 15 23:45:01.808053 kernel: pcieport 0000:00:01.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 23:45:01.808113 kernel: pcieport 0000:00:01.5: PME: Signaling with IRQ 55 Jan 15 23:45:01.808171 kernel: pcieport 0000:00:01.5: AER: 
enabled with IRQ 55 Jan 15 23:45:01.808229 kernel: pcieport 0000:00:01.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 23:45:01.808292 kernel: pcieport 0000:00:01.6: PME: Signaling with IRQ 56 Jan 15 23:45:01.808351 kernel: pcieport 0000:00:01.6: AER: enabled with IRQ 56 Jan 15 23:45:01.808409 kernel: pcieport 0000:00:01.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 23:45:01.808477 kernel: pcieport 0000:00:01.7: PME: Signaling with IRQ 57 Jan 15 23:45:01.808537 kernel: pcieport 0000:00:01.7: AER: enabled with IRQ 57 Jan 15 23:45:01.808597 kernel: pcieport 0000:00:01.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 23:45:01.808607 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Jan 15 23:45:01.808665 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 58 Jan 15 23:45:01.808726 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 58 Jan 15 23:45:01.808785 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 23:45:01.808845 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 59 Jan 15 23:45:01.808905 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 59 Jan 15 23:45:01.808964 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 23:45:01.809024 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 60 Jan 15 23:45:01.809082 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 60 Jan 15 23:45:01.809140 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 23:45:01.809202 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 61 Jan 15 23:45:01.809261 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 61 Jan 15 23:45:01.809319 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 23:45:01.809379 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 62 Jan 15 23:45:01.809451 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 62 Jan 15 23:45:01.809511 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 23:45:01.809573 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 63 Jan 15 23:45:01.809633 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 63 Jan 15 23:45:01.809691 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 23:45:01.809750 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 64 Jan 15 23:45:01.809808 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 64 Jan 15 23:45:01.809866 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 23:45:01.809926 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 65 Jan 15 23:45:01.809985 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 65 Jan 15 23:45:01.810057 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- 
LLActRep+ Jan 15 23:45:01.810070 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Jan 15 23:45:01.810128 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 66 Jan 15 23:45:01.810187 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 66 Jan 15 23:45:01.810245 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 23:45:01.810305 kernel: pcieport 0000:00:03.1: PME: Signaling with IRQ 67 Jan 15 23:45:01.810363 kernel: pcieport 0000:00:03.1: AER: enabled with IRQ 67 Jan 15 23:45:01.810420 kernel: pcieport 0000:00:03.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 23:45:01.810490 kernel: pcieport 0000:00:03.2: PME: Signaling with IRQ 68 Jan 15 23:45:01.810551 kernel: pcieport 0000:00:03.2: AER: enabled with IRQ 68 Jan 15 23:45:01.810610 kernel: pcieport 0000:00:03.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 23:45:01.810670 kernel: pcieport 0000:00:03.3: PME: Signaling with IRQ 69 Jan 15 23:45:01.810728 kernel: pcieport 0000:00:03.3: AER: enabled with IRQ 69 Jan 15 23:45:01.810786 kernel: pcieport 0000:00:03.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 23:45:01.810846 kernel: pcieport 0000:00:03.4: PME: Signaling with IRQ 70 Jan 15 23:45:01.810904 kernel: pcieport 0000:00:03.4: AER: enabled with IRQ 70 Jan 15 23:45:01.810962 kernel: pcieport 0000:00:03.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 23:45:01.811024 kernel: pcieport 0000:00:03.5: PME: Signaling with IRQ 71 Jan 15 23:45:01.811082 kernel: pcieport 0000:00:03.5: AER: enabled with IRQ 71 Jan 15 23:45:01.811140 kernel: pcieport 0000:00:03.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 23:45:01.811200 kernel: pcieport 0000:00:03.6: PME: Signaling with IRQ 72 Jan 15 23:45:01.811258 kernel: pcieport 0000:00:03.6: AER: enabled with IRQ 72 Jan 15 23:45:01.811316 kernel: pcieport 0000:00:03.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 23:45:01.811376 kernel: pcieport 0000:00:03.7: PME: Signaling with IRQ 73 Jan 15 23:45:01.811442 kernel: pcieport 0000:00:03.7: AER: enabled with IRQ 73 Jan 15 23:45:01.811503 kernel: pcieport 0000:00:03.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 23:45:01.811512 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Jan 15 23:45:01.811571 kernel: pcieport 0000:00:04.0: PME: Signaling with IRQ 74 Jan 15 23:45:01.811630 kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 74 Jan 15 23:45:01.811688 kernel: pcieport 0000:00:04.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 23:45:01.811749 kernel: pcieport 0000:00:04.1: PME: Signaling with IRQ 75 Jan 15 23:45:01.811807 kernel: pcieport 0000:00:04.1: AER: enabled with IRQ 75 Jan 15 23:45:01.811867 kernel: pcieport 0000:00:04.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 23:45:01.811927 kernel: pcieport 0000:00:04.2: PME: Signaling with IRQ 76 Jan 15 
23:45:01.811985 kernel: pcieport 0000:00:04.2: AER: enabled with IRQ 76 Jan 15 23:45:01.812043 kernel: pcieport 0000:00:04.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 23:45:01.812103 kernel: pcieport 0000:00:04.3: PME: Signaling with IRQ 77 Jan 15 23:45:01.812162 kernel: pcieport 0000:00:04.3: AER: enabled with IRQ 77 Jan 15 23:45:01.812220 kernel: pcieport 0000:00:04.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 23:45:01.812280 kernel: pcieport 0000:00:04.4: PME: Signaling with IRQ 78 Jan 15 23:45:01.812341 kernel: pcieport 0000:00:04.4: AER: enabled with IRQ 78 Jan 15 23:45:01.812398 kernel: pcieport 0000:00:04.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 23:45:01.812473 kernel: pcieport 0000:00:04.5: PME: Signaling with IRQ 79 Jan 15 23:45:01.812534 kernel: pcieport 0000:00:04.5: AER: enabled with IRQ 79 Jan 15 23:45:01.812592 kernel: pcieport 0000:00:04.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 23:45:01.812653 kernel: pcieport 0000:00:04.6: PME: Signaling with IRQ 80 Jan 15 23:45:01.812712 kernel: pcieport 0000:00:04.6: AER: enabled with IRQ 80 Jan 15 23:45:01.812770 kernel: pcieport 0000:00:04.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 23:45:01.812833 kernel: pcieport 0000:00:04.7: PME: Signaling with IRQ 81 Jan 15 23:45:01.812891 kernel: pcieport 0000:00:04.7: AER: enabled with IRQ 81 Jan 15 23:45:01.812949 kernel: pcieport 0000:00:04.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 23:45:01.813010 kernel: pcieport 0000:00:05.0: PME: Signaling with IRQ 82 Jan 15 23:45:01.813067 kernel: pcieport 0000:00:05.0: AER: enabled with IRQ 82 Jan 15 23:45:01.813125 kernel: pcieport 0000:00:05.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 23:45:01.813135 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Jan 15 23:45:01.813144 kernel: ACPI: button: Power Button [PWRB] Jan 15 23:45:01.813207 kernel: virtio-pci 0000:01:00.0: enabling device (0000 -> 0002) Jan 15 23:45:01.813273 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Jan 15 23:45:01.813283 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jan 15 23:45:01.813290 kernel: thunder_xcv, ver 1.0 Jan 15 23:45:01.813297 kernel: thunder_bgx, ver 1.0 Jan 15 23:45:01.813305 kernel: nicpf, ver 1.0 Jan 15 23:45:01.813312 kernel: nicvf, ver 1.0 Jan 15 23:45:01.813381 kernel: rtc-efi rtc-efi.0: registered as rtc0 Jan 15 23:45:01.813452 kernel: rtc-efi rtc-efi.0: setting system clock to 2026-01-15T23:45:01 UTC (1768520701) Jan 15 23:45:01.813463 kernel: hid: raw HID events driver (C) Jiri Kosina Jan 15 23:45:01.813471 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available Jan 15 23:45:01.813478 kernel: watchdog: NMI not fully supported Jan 15 23:45:01.813485 kernel: watchdog: Hard watchdog permanently disabled Jan 15 23:45:01.813493 kernel: NET: Registered PF_INET6 protocol family Jan 15 23:45:01.813500 kernel: Segment Routing with IPv6 Jan 15 23:45:01.813507 kernel: In-situ OAM (IOAM) with 
IPv6 Jan 15 23:45:01.813516 kernel: NET: Registered PF_PACKET protocol family Jan 15 23:45:01.813524 kernel: Key type dns_resolver registered Jan 15 23:45:01.813531 kernel: registered taskstats version 1 Jan 15 23:45:01.813539 kernel: Loading compiled-in X.509 certificates Jan 15 23:45:01.813547 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.65-flatcar: b110dfc7e70ecac41e34f52a0c530f0543b60d51' Jan 15 23:45:01.813554 kernel: Demotion targets for Node 0: null Jan 15 23:45:01.813561 kernel: Key type .fscrypt registered Jan 15 23:45:01.813568 kernel: Key type fscrypt-provisioning registered Jan 15 23:45:01.813576 kernel: ima: No TPM chip found, activating TPM-bypass! Jan 15 23:45:01.813583 kernel: ima: Allocated hash algorithm: sha1 Jan 15 23:45:01.813592 kernel: ima: No architecture policies found Jan 15 23:45:01.813599 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Jan 15 23:45:01.813607 kernel: clk: Disabling unused clocks Jan 15 23:45:01.813614 kernel: PM: genpd: Disabling unused power domains Jan 15 23:45:01.813622 kernel: Warning: unable to open an initial console. Jan 15 23:45:01.813629 kernel: Freeing unused kernel memory: 39552K Jan 15 23:45:01.813637 kernel: Run /init as init process Jan 15 23:45:01.813644 kernel: with arguments: Jan 15 23:45:01.813651 kernel: /init Jan 15 23:45:01.813660 kernel: with environment: Jan 15 23:45:01.813667 kernel: HOME=/ Jan 15 23:45:01.813674 kernel: TERM=linux Jan 15 23:45:01.813683 systemd[1]: Successfully made /usr/ read-only. Jan 15 23:45:01.813693 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 15 23:45:01.813701 systemd[1]: Detected virtualization kvm. Jan 15 23:45:01.813709 systemd[1]: Detected architecture arm64. Jan 15 23:45:01.813717 systemd[1]: Running in initrd. Jan 15 23:45:01.813725 systemd[1]: No hostname configured, using default hostname. Jan 15 23:45:01.813733 systemd[1]: Hostname set to <localhost>. Jan 15 23:45:01.813741 systemd[1]: Initializing machine ID from VM UUID. Jan 15 23:45:01.813748 systemd[1]: Queued start job for default target initrd.target. Jan 15 23:45:01.813756 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 15 23:45:01.813772 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 15 23:45:01.813783 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jan 15 23:45:01.813791 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 15 23:45:01.813799 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jan 15 23:45:01.813809 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jan 15 23:45:01.813818 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Jan 15 23:45:01.813826 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Jan 15 23:45:01.813834 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). 
Jan 15 23:45:01.813843 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 15 23:45:01.813851 systemd[1]: Reached target paths.target - Path Units. Jan 15 23:45:01.813859 systemd[1]: Reached target slices.target - Slice Units. Jan 15 23:45:01.813868 systemd[1]: Reached target swap.target - Swaps. Jan 15 23:45:01.813876 systemd[1]: Reached target timers.target - Timer Units. Jan 15 23:45:01.813884 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jan 15 23:45:01.813892 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 15 23:45:01.813900 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jan 15 23:45:01.813908 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Jan 15 23:45:01.813916 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 15 23:45:01.813924 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 15 23:45:01.813932 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 15 23:45:01.813941 systemd[1]: Reached target sockets.target - Socket Units. Jan 15 23:45:01.813950 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jan 15 23:45:01.813960 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 15 23:45:01.813968 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jan 15 23:45:01.813976 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Jan 15 23:45:01.813984 systemd[1]: Starting systemd-fsck-usr.service... Jan 15 23:45:01.813992 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 15 23:45:01.814012 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 15 23:45:01.814021 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 15 23:45:01.814029 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 15 23:45:01.814038 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jan 15 23:45:01.814046 systemd[1]: Finished systemd-fsck-usr.service. Jan 15 23:45:01.814083 systemd-journald[312]: Collecting audit messages is disabled. Jan 15 23:45:01.814104 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 15 23:45:01.814112 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 15 23:45:01.814121 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jan 15 23:45:01.814130 kernel: Bridge firewalling registered Jan 15 23:45:01.814138 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 15 23:45:01.814146 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 15 23:45:01.814155 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 15 23:45:01.814163 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 15 23:45:01.814171 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 15 23:45:01.814180 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. 
Jan 15 23:45:01.814188 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 15 23:45:01.814198 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 15 23:45:01.814206 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jan 15 23:45:01.814215 systemd-journald[312]: Journal started Jan 15 23:45:01.814233 systemd-journald[312]: Runtime Journal (/run/log/journal/83774ded3dbc43478b42bd0dbc5602f8) is 8M, max 319.5M, 311.5M free. Jan 15 23:45:01.759093 systemd-modules-load[313]: Inserted module 'overlay' Jan 15 23:45:01.816993 systemd[1]: Started systemd-journald.service - Journal Service. Jan 15 23:45:01.773395 systemd-modules-load[313]: Inserted module 'br_netfilter' Jan 15 23:45:01.818940 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 15 23:45:01.828301 dracut-cmdline[342]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=83f7d443283b2e87b6283ab8b3252eb2d2356b218981a63efeb3e370fba6f971 Jan 15 23:45:01.838233 systemd-tmpfiles[348]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Jan 15 23:45:01.842696 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 15 23:45:01.845079 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 15 23:45:01.883900 systemd-resolved[388]: Positive Trust Anchors: Jan 15 23:45:01.883919 systemd-resolved[388]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 15 23:45:01.883950 systemd-resolved[388]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 15 23:45:01.889362 systemd-resolved[388]: Defaulting to hostname 'linux'. Jan 15 23:45:01.890766 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 15 23:45:01.891663 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 15 23:45:01.901460 kernel: SCSI subsystem initialized Jan 15 23:45:01.906457 kernel: Loading iSCSI transport class v2.0-870. Jan 15 23:45:01.913458 kernel: iscsi: registered transport (tcp) Jan 15 23:45:01.926460 kernel: iscsi: registered transport (qla4xxx) Jan 15 23:45:01.926511 kernel: QLogic iSCSI HBA Driver Jan 15 23:45:01.943323 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 15 23:45:01.957361 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 15 23:45:01.959420 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 15 23:45:02.005004 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. 
Jan 15 23:45:02.007162 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jan 15 23:45:02.066471 kernel: raid6: neonx8 gen() 15728 MB/s Jan 15 23:45:02.083450 kernel: raid6: neonx4 gen() 15827 MB/s Jan 15 23:45:02.100451 kernel: raid6: neonx2 gen() 13262 MB/s Jan 15 23:45:02.117451 kernel: raid6: neonx1 gen() 10492 MB/s Jan 15 23:45:02.134477 kernel: raid6: int64x8 gen() 6912 MB/s Jan 15 23:45:02.151450 kernel: raid6: int64x4 gen() 7379 MB/s Jan 15 23:45:02.168482 kernel: raid6: int64x2 gen() 6111 MB/s Jan 15 23:45:02.185476 kernel: raid6: int64x1 gen() 5062 MB/s Jan 15 23:45:02.185531 kernel: raid6: using algorithm neonx4 gen() 15827 MB/s Jan 15 23:45:02.202464 kernel: raid6: .... xor() 12342 MB/s, rmw enabled Jan 15 23:45:02.202493 kernel: raid6: using neon recovery algorithm Jan 15 23:45:02.207554 kernel: xor: measuring software checksum speed Jan 15 23:45:02.207579 kernel: 8regs : 21618 MB/sec Jan 15 23:45:02.208666 kernel: 32regs : 21687 MB/sec Jan 15 23:45:02.208714 kernel: arm64_neon : 28089 MB/sec Jan 15 23:45:02.208747 kernel: xor: using function: arm64_neon (28089 MB/sec) Jan 15 23:45:02.261467 kernel: Btrfs loaded, zoned=no, fsverity=no Jan 15 23:45:02.267991 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jan 15 23:45:02.270367 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 15 23:45:02.296972 systemd-udevd[564]: Using default interface naming scheme 'v255'. Jan 15 23:45:02.301084 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 15 23:45:02.303390 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jan 15 23:45:02.324298 dracut-pre-trigger[571]: rd.md=0: removing MD RAID activation Jan 15 23:45:02.348288 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jan 15 23:45:02.350560 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 15 23:45:02.423644 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 15 23:45:02.425800 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jan 15 23:45:02.465474 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues Jan 15 23:45:02.467592 kernel: virtio_blk virtio1: [vda] 104857600 512-byte logical blocks (53.7 GB/50.0 GiB) Jan 15 23:45:02.478685 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Jan 15 23:45:02.478732 kernel: GPT:17805311 != 104857599 Jan 15 23:45:02.479899 kernel: GPT:Alternate GPT header not at the end of the disk. Jan 15 23:45:02.479934 kernel: GPT:17805311 != 104857599 Jan 15 23:45:02.480788 kernel: GPT: Use GNU Parted to correct GPT errors. Jan 15 23:45:02.481873 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jan 15 23:45:02.488716 kernel: ACPI: bus type USB registered Jan 15 23:45:02.488768 kernel: usbcore: registered new interface driver usbfs Jan 15 23:45:02.489469 kernel: usbcore: registered new interface driver hub Jan 15 23:45:02.490894 kernel: usbcore: registered new device driver usb Jan 15 23:45:02.506987 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jan 15 23:45:02.507186 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Jan 15 23:45:02.506756 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. 
Jan 15 23:45:02.513266 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Jan 15 23:45:02.513428 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jan 15 23:45:02.513542 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Jan 15 23:45:02.513619 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Jan 15 23:45:02.506953 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 15 23:45:02.509662 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jan 15 23:45:02.513473 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 15 23:45:02.518902 kernel: hub 1-0:1.0: USB hub found Jan 15 23:45:02.519072 kernel: hub 1-0:1.0: 4 ports detected Jan 15 23:45:02.521989 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Jan 15 23:45:02.522163 kernel: hub 2-0:1.0: USB hub found Jan 15 23:45:02.525457 kernel: hub 2-0:1.0: 4 ports detected Jan 15 23:45:02.535463 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 15 23:45:02.561152 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Jan 15 23:45:02.563344 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jan 15 23:45:02.572162 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Jan 15 23:45:02.580685 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Jan 15 23:45:02.582787 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Jan 15 23:45:02.590667 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Jan 15 23:45:02.591661 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Jan 15 23:45:02.593297 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 15 23:45:02.595145 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 15 23:45:02.597529 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Jan 15 23:45:02.599138 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jan 15 23:45:02.611963 disk-uuid[661]: Primary Header is updated. Jan 15 23:45:02.611963 disk-uuid[661]: Secondary Entries is updated. Jan 15 23:45:02.611963 disk-uuid[661]: Secondary Header is updated. Jan 15 23:45:02.618130 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. 
Jan 15 23:45:02.620452 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jan 15 23:45:02.760486 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Jan 15 23:45:02.891475 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1 Jan 15 23:45:02.891545 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Jan 15 23:45:02.891741 kernel: usbcore: registered new interface driver usbhid Jan 15 23:45:02.892540 kernel: usbhid: USB HID core driver Jan 15 23:45:02.997481 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd Jan 15 23:45:03.122465 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2 Jan 15 23:45:03.174472 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0 Jan 15 23:45:03.631466 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jan 15 23:45:03.631605 disk-uuid[664]: The operation has completed successfully. Jan 15 23:45:03.669831 systemd[1]: disk-uuid.service: Deactivated successfully. Jan 15 23:45:03.669936 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jan 15 23:45:03.695945 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Jan 15 23:45:03.712891 sh[684]: Success Jan 15 23:45:03.725483 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Jan 15 23:45:03.725522 kernel: device-mapper: uevent: version 1.0.3 Jan 15 23:45:03.726464 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Jan 15 23:45:03.733457 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Jan 15 23:45:03.784704 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Jan 15 23:45:03.786367 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Jan 15 23:45:03.799466 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Jan 15 23:45:03.813452 kernel: BTRFS: device fsid 4e574c26-9d5a-48bc-a727-ae12db8ee9fc devid 1 transid 38 /dev/mapper/usr (253:0) scanned by mount (696) Jan 15 23:45:03.815961 kernel: BTRFS info (device dm-0): first mount of filesystem 4e574c26-9d5a-48bc-a727-ae12db8ee9fc Jan 15 23:45:03.815991 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Jan 15 23:45:03.830457 kernel: BTRFS info (device dm-0): disabling log replay at mount time Jan 15 23:45:03.830481 kernel: BTRFS info (device dm-0): enabling free space tree Jan 15 23:45:03.834753 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Jan 15 23:45:03.835914 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Jan 15 23:45:03.837036 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Jan 15 23:45:03.837857 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Jan 15 23:45:03.840559 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... 
Jan 15 23:45:03.873454 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (725) Jan 15 23:45:03.875954 kernel: BTRFS info (device vda6): first mount of filesystem c6a95867-5704-41e1-8beb-48e00b50aef1 Jan 15 23:45:03.876012 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Jan 15 23:45:03.880488 kernel: BTRFS info (device vda6): turning on async discard Jan 15 23:45:03.880543 kernel: BTRFS info (device vda6): enabling free space tree Jan 15 23:45:03.884456 kernel: BTRFS info (device vda6): last unmount of filesystem c6a95867-5704-41e1-8beb-48e00b50aef1 Jan 15 23:45:03.885334 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jan 15 23:45:03.887179 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jan 15 23:45:03.943288 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 15 23:45:03.946181 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 15 23:45:03.985929 systemd-networkd[866]: lo: Link UP Jan 15 23:45:03.985941 systemd-networkd[866]: lo: Gained carrier Jan 15 23:45:03.986889 systemd-networkd[866]: Enumeration completed Jan 15 23:45:03.987100 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 15 23:45:03.987327 systemd-networkd[866]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jan 15 23:45:03.987331 systemd-networkd[866]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 15 23:45:03.988143 systemd-networkd[866]: eth0: Link UP Jan 15 23:45:03.988229 systemd-networkd[866]: eth0: Gained carrier Jan 15 23:45:03.988238 systemd-networkd[866]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jan 15 23:45:03.989563 systemd[1]: Reached target network.target - Network. Jan 15 23:45:04.010495 systemd-networkd[866]: eth0: DHCPv4 address 10.0.10.219/25, gateway 10.0.10.129 acquired from 10.0.10.129 Jan 15 23:45:04.036505 ignition[787]: Ignition 2.22.0 Jan 15 23:45:04.036516 ignition[787]: Stage: fetch-offline Jan 15 23:45:04.036557 ignition[787]: no configs at "/usr/lib/ignition/base.d" Jan 15 23:45:04.036564 ignition[787]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 15 23:45:04.036645 ignition[787]: parsed url from cmdline: "" Jan 15 23:45:04.036648 ignition[787]: no config URL provided Jan 15 23:45:04.036652 ignition[787]: reading system config file "/usr/lib/ignition/user.ign" Jan 15 23:45:04.040470 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jan 15 23:45:04.036658 ignition[787]: no config at "/usr/lib/ignition/user.ign" Jan 15 23:45:04.042559 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
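The DHCPv4 lease logged above can be sanity-checked with the standard-library ipaddress module; the address, prefix, and gateway are copied from the log:

    import ipaddress

    iface = ipaddress.ip_interface("10.0.10.219/25")
    print(iface.network)                                         # 10.0.10.128/25
    print(ipaddress.ip_address("10.0.10.129") in iface.network)  # True: gateway is on-link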
Jan 15 23:45:04.036663 ignition[787]: failed to fetch config: resource requires networking Jan 15 23:45:04.036830 ignition[787]: Ignition finished successfully Jan 15 23:45:04.080049 ignition[884]: Ignition 2.22.0 Jan 15 23:45:04.080064 ignition[884]: Stage: fetch Jan 15 23:45:04.080197 ignition[884]: no configs at "/usr/lib/ignition/base.d" Jan 15 23:45:04.080206 ignition[884]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 15 23:45:04.080276 ignition[884]: parsed url from cmdline: "" Jan 15 23:45:04.080279 ignition[884]: no config URL provided Jan 15 23:45:04.080284 ignition[884]: reading system config file "/usr/lib/ignition/user.ign" Jan 15 23:45:04.080290 ignition[884]: no config at "/usr/lib/ignition/user.ign" Jan 15 23:45:04.080794 ignition[884]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Jan 15 23:45:04.080823 ignition[884]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Jan 15 23:45:04.080828 ignition[884]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Jan 15 23:45:04.837925 ignition[884]: GET result: OK Jan 15 23:45:04.838052 ignition[884]: parsing config with SHA512: 3085691c4eae23d7cc930fe6e149bddbdc6747270c42abb8e107971add085ef52178b6b23e02094c300ec2e23161d1f9f4cfcdd9fc5d3cb7f116f1fa4c836fba Jan 15 23:45:04.843203 unknown[884]: fetched base config from "system" Jan 15 23:45:04.843212 unknown[884]: fetched base config from "system" Jan 15 23:45:04.843559 ignition[884]: fetch: fetch complete Jan 15 23:45:04.843217 unknown[884]: fetched user config from "openstack" Jan 15 23:45:04.843563 ignition[884]: fetch: fetch passed Jan 15 23:45:04.845604 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Jan 15 23:45:04.843601 ignition[884]: Ignition finished successfully Jan 15 23:45:04.847703 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jan 15 23:45:04.875200 ignition[892]: Ignition 2.22.0 Jan 15 23:45:04.875217 ignition[892]: Stage: kargs Jan 15 23:45:04.875353 ignition[892]: no configs at "/usr/lib/ignition/base.d" Jan 15 23:45:04.875362 ignition[892]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 15 23:45:04.876059 ignition[892]: kargs: kargs passed Jan 15 23:45:04.876102 ignition[892]: Ignition finished successfully Jan 15 23:45:04.878594 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jan 15 23:45:04.880762 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jan 15 23:45:04.907808 ignition[900]: Ignition 2.22.0 Jan 15 23:45:04.907827 ignition[900]: Stage: disks Jan 15 23:45:04.907959 ignition[900]: no configs at "/usr/lib/ignition/base.d" Jan 15 23:45:04.907968 ignition[900]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 15 23:45:04.911416 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jan 15 23:45:04.908692 ignition[900]: disks: disks passed Jan 15 23:45:04.912308 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jan 15 23:45:04.908737 ignition[900]: Ignition finished successfully Jan 15 23:45:04.913687 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jan 15 23:45:04.915049 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 15 23:45:04.916558 systemd[1]: Reached target sysinit.target - System Initialization. Jan 15 23:45:04.917935 systemd[1]: Reached target basic.target - Basic System. 
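Ignition's fetch stage above pulls user_data from the OpenStack metadata service and logs a SHA512 of the raw bytes before parsing. A minimal stand-in for that fetch-and-hash step (Ignition itself is a Go binary; the endpoint only answers from inside the instance):

    import hashlib
    import urllib.request

    # Endpoint taken from the log above.
    URL = "http://169.254.169.254/openstack/latest/user_data"

    with urllib.request.urlopen(URL, timeout=5) as resp:
        data = resp.read()
    print("parsing config with SHA512:", hashlib.sha512(data).hexdigest())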
Jan 15 23:45:04.920280 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jan 15 23:45:04.957763 systemd-fsck[910]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks Jan 15 23:45:04.963961 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jan 15 23:45:04.966062 systemd[1]: Mounting sysroot.mount - /sysroot... Jan 15 23:45:05.058490 kernel: EXT4-fs (vda9): mounted filesystem e775b4a8-7fa9-4c45-80b7-b5e0f0a5e4b9 r/w with ordered data mode. Quota mode: none. Jan 15 23:45:05.059393 systemd[1]: Mounted sysroot.mount - /sysroot. Jan 15 23:45:05.060482 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jan 15 23:45:05.063020 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 15 23:45:05.064850 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jan 15 23:45:05.065683 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Jan 15 23:45:05.066331 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Jan 15 23:45:05.069424 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jan 15 23:45:05.069479 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jan 15 23:45:05.074854 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jan 15 23:45:05.076949 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Jan 15 23:45:05.088451 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (918) Jan 15 23:45:05.090672 kernel: BTRFS info (device vda6): first mount of filesystem c6a95867-5704-41e1-8beb-48e00b50aef1 Jan 15 23:45:05.090736 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Jan 15 23:45:05.096452 kernel: BTRFS info (device vda6): turning on async discard Jan 15 23:45:05.096509 kernel: BTRFS info (device vda6): enabling free space tree Jan 15 23:45:05.097557 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 15 23:45:05.136702 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 23:45:05.156017 initrd-setup-root[946]: cut: /sysroot/etc/passwd: No such file or directory Jan 15 23:45:05.162385 initrd-setup-root[953]: cut: /sysroot/etc/group: No such file or directory Jan 15 23:45:05.167662 initrd-setup-root[960]: cut: /sysroot/etc/shadow: No such file or directory Jan 15 23:45:05.171722 initrd-setup-root[967]: cut: /sysroot/etc/gshadow: No such file or directory Jan 15 23:45:05.255642 systemd-networkd[866]: eth0: Gained IPv6LL Jan 15 23:45:05.263951 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jan 15 23:45:05.266133 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jan 15 23:45:05.267681 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jan 15 23:45:05.292678 systemd[1]: sysroot-oem.mount: Deactivated successfully. Jan 15 23:45:05.294134 kernel: BTRFS info (device vda6): last unmount of filesystem c6a95867-5704-41e1-8beb-48e00b50aef1 Jan 15 23:45:05.311877 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
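For scale, the fsck summary earlier in this stretch ("clean, 15/1628000 files, 120826/1617920 blocks") means the freshly created root filesystem is almost empty:

    files_used, files_total = 15, 1_628_000
    blocks_used, blocks_total = 120_826, 1_617_920
    print(f"inodes in use: {files_used / files_total:.4%}")    # 0.0009%
    print(f"blocks in use: {blocks_used / blocks_total:.2%}")  # 7.47%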
Jan 15 23:45:05.321941 ignition[1034]: INFO : Ignition 2.22.0 Jan 15 23:45:05.321941 ignition[1034]: INFO : Stage: mount Jan 15 23:45:05.323339 ignition[1034]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 15 23:45:05.323339 ignition[1034]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 15 23:45:05.323339 ignition[1034]: INFO : mount: mount passed Jan 15 23:45:05.323339 ignition[1034]: INFO : Ignition finished successfully Jan 15 23:45:05.324326 systemd[1]: Finished ignition-mount.service - Ignition (mount). Jan 15 23:45:06.187519 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 23:45:08.194503 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 23:45:12.201491 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 23:45:12.209886 coreos-metadata[920]: Jan 15 23:45:12.209 WARN failed to locate config-drive, using the metadata service API instead Jan 15 23:45:12.226868 coreos-metadata[920]: Jan 15 23:45:12.226 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 15 23:45:12.836921 coreos-metadata[920]: Jan 15 23:45:12.836 INFO Fetch successful Jan 15 23:45:12.838126 coreos-metadata[920]: Jan 15 23:45:12.837 INFO wrote hostname ci-4459-2-2-n-b7ec270451 to /sysroot/etc/hostname Jan 15 23:45:12.840171 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Jan 15 23:45:12.840274 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. Jan 15 23:45:12.842489 systemd[1]: Starting ignition-files.service - Ignition (files)... Jan 15 23:45:12.866774 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 15 23:45:12.898503 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1052) Jan 15 23:45:12.898547 kernel: BTRFS info (device vda6): first mount of filesystem c6a95867-5704-41e1-8beb-48e00b50aef1 Jan 15 23:45:12.900093 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Jan 15 23:45:12.906455 kernel: BTRFS info (device vda6): turning on async discard Jan 15 23:45:12.906482 kernel: BTRFS info (device vda6): enabling free space tree Jan 15 23:45:12.907836 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
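Note the spacing of the config-2 probes above (~05s, 06s, 08s, 12s): the wait between attempts roughly doubles before coreos-metadata gives up on the config drive and falls back to the metadata API. A generic sketch of that polling pattern (not the agent's actual code; attempt count and timings are illustrative):

    import os
    import time

    def wait_for_device(path, attempts=4, delay=1.0):
        # Poll for a device node, doubling the wait each round,
        # roughly matching the probe spacing in the log (~1 s, 2 s, 4 s).
        for _ in range(attempts):
            if os.path.exists(path):
                return True
            time.sleep(delay)
            delay *= 2
        return False

    wait_for_device("/dev/disk/by-label/config-2")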
Jan 15 23:45:12.937110 ignition[1070]: INFO : Ignition 2.22.0 Jan 15 23:45:12.937110 ignition[1070]: INFO : Stage: files Jan 15 23:45:12.938652 ignition[1070]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 15 23:45:12.938652 ignition[1070]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 15 23:45:12.938652 ignition[1070]: DEBUG : files: compiled without relabeling support, skipping Jan 15 23:45:12.941345 ignition[1070]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Jan 15 23:45:12.941345 ignition[1070]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Jan 15 23:45:12.945053 ignition[1070]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Jan 15 23:45:12.946159 ignition[1070]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Jan 15 23:45:12.946159 ignition[1070]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Jan 15 23:45:12.945656 unknown[1070]: wrote ssh authorized keys file for user: core Jan 15 23:45:12.949289 ignition[1070]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Jan 15 23:45:12.949289 ignition[1070]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1 Jan 15 23:45:13.006875 ignition[1070]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Jan 15 23:45:13.111173 ignition[1070]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Jan 15 23:45:13.111173 ignition[1070]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Jan 15 23:45:13.114546 ignition[1070]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Jan 15 23:45:13.114546 ignition[1070]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Jan 15 23:45:13.114546 ignition[1070]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Jan 15 23:45:13.114546 ignition[1070]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 15 23:45:13.114546 ignition[1070]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 15 23:45:13.114546 ignition[1070]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 15 23:45:13.114546 ignition[1070]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 15 23:45:13.124652 ignition[1070]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Jan 15 23:45:13.124652 ignition[1070]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Jan 15 23:45:13.124652 ignition[1070]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Jan 15 23:45:13.124652 ignition[1070]: INFO : files: createFilesystemsFiles: createFiles: 
op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Jan 15 23:45:13.124652 ignition[1070]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Jan 15 23:45:13.124652 ignition[1070]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-arm64.raw: attempt #1 Jan 15 23:45:13.392902 ignition[1070]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Jan 15 23:45:14.218332 ignition[1070]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Jan 15 23:45:14.218332 ignition[1070]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Jan 15 23:45:14.221852 ignition[1070]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 15 23:45:14.233731 ignition[1070]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 15 23:45:14.233731 ignition[1070]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Jan 15 23:45:14.233731 ignition[1070]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Jan 15 23:45:14.239447 ignition[1070]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Jan 15 23:45:14.239447 ignition[1070]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Jan 15 23:45:14.239447 ignition[1070]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Jan 15 23:45:14.239447 ignition[1070]: INFO : files: files passed Jan 15 23:45:14.239447 ignition[1070]: INFO : Ignition finished successfully Jan 15 23:45:14.237552 systemd[1]: Finished ignition-files.service - Ignition (files). Jan 15 23:45:14.239352 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Jan 15 23:45:14.242563 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Jan 15 23:45:14.267260 systemd[1]: ignition-quench.service: Deactivated successfully. Jan 15 23:45:14.267358 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Jan 15 23:45:14.272580 initrd-setup-root-after-ignition[1101]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 15 23:45:14.272580 initrd-setup-root-after-ignition[1101]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jan 15 23:45:14.275167 initrd-setup-root-after-ignition[1105]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 15 23:45:14.275002 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 15 23:45:14.276385 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Jan 15 23:45:14.279041 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jan 15 23:45:14.324726 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Jan 15 23:45:14.324833 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. 
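The files-stage operations above (the helm tarball, the enabled prepare-helm.service, the sysext link) correspond to entries in the fetched Ignition config. The real user_data is not shown in the log; a hypothetical fragment with the same shape, built in Python for illustration:

    import json

    config = {
        "ignition": {"version": "3.4.0"},
        "storage": {
            "files": [
                {
                    "path": "/opt/helm-v3.17.3-linux-arm64.tar.gz",
                    "contents": {"source": "https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz"},
                }
            ]
        },
        "systemd": {
            "units": [{"name": "prepare-helm.service", "enabled": True}]
        },
    }

    print(json.dumps(config, indent=2))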
Jan 15 23:45:14.328648 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Jan 15 23:45:14.329504 systemd[1]: Reached target initrd.target - Initrd Default Target. Jan 15 23:45:14.331054 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jan 15 23:45:14.331902 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jan 15 23:45:14.359020 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 15 23:45:14.361218 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Jan 15 23:45:14.387704 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Jan 15 23:45:14.388732 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 15 23:45:14.390558 systemd[1]: Stopped target timers.target - Timer Units. Jan 15 23:45:14.392138 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jan 15 23:45:14.392262 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 15 23:45:14.394289 systemd[1]: Stopped target initrd.target - Initrd Default Target. Jan 15 23:45:14.395982 systemd[1]: Stopped target basic.target - Basic System. Jan 15 23:45:14.397249 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jan 15 23:45:14.398666 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Jan 15 23:45:14.400258 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jan 15 23:45:14.401884 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Jan 15 23:45:14.403390 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Jan 15 23:45:14.405003 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Jan 15 23:45:14.406567 systemd[1]: Stopped target sysinit.target - System Initialization. Jan 15 23:45:14.408120 systemd[1]: Stopped target local-fs.target - Local File Systems. Jan 15 23:45:14.409514 systemd[1]: Stopped target swap.target - Swaps. Jan 15 23:45:14.410820 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Jan 15 23:45:14.410947 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Jan 15 23:45:14.412878 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Jan 15 23:45:14.414405 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 15 23:45:14.416005 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Jan 15 23:45:14.417489 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 15 23:45:14.418990 systemd[1]: dracut-initqueue.service: Deactivated successfully. Jan 15 23:45:14.419106 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Jan 15 23:45:14.421341 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Jan 15 23:45:14.421481 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 15 23:45:14.423089 systemd[1]: ignition-files.service: Deactivated successfully. Jan 15 23:45:14.423187 systemd[1]: Stopped ignition-files.service - Ignition (files). Jan 15 23:45:14.425389 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Jan 15 23:45:14.426836 systemd[1]: kmod-static-nodes.service: Deactivated successfully. 
Jan 15 23:45:14.426961 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Jan 15 23:45:14.429179 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Jan 15 23:45:14.430539 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Jan 15 23:45:14.430661 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Jan 15 23:45:14.432279 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Jan 15 23:45:14.432378 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Jan 15 23:45:14.436818 systemd[1]: initrd-cleanup.service: Deactivated successfully. Jan 15 23:45:14.439620 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Jan 15 23:45:14.450498 systemd[1]: sysroot-boot.mount: Deactivated successfully. Jan 15 23:45:14.453493 systemd[1]: sysroot-boot.service: Deactivated successfully. Jan 15 23:45:14.453593 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Jan 15 23:45:14.455702 ignition[1125]: INFO : Ignition 2.22.0 Jan 15 23:45:14.455702 ignition[1125]: INFO : Stage: umount Jan 15 23:45:14.455702 ignition[1125]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 15 23:45:14.455702 ignition[1125]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 15 23:45:14.459332 ignition[1125]: INFO : umount: umount passed Jan 15 23:45:14.459332 ignition[1125]: INFO : Ignition finished successfully Jan 15 23:45:14.457628 systemd[1]: ignition-mount.service: Deactivated successfully. Jan 15 23:45:14.457708 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Jan 15 23:45:14.458901 systemd[1]: ignition-disks.service: Deactivated successfully. Jan 15 23:45:14.458982 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Jan 15 23:45:14.460209 systemd[1]: ignition-kargs.service: Deactivated successfully. Jan 15 23:45:14.460254 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Jan 15 23:45:14.461418 systemd[1]: ignition-fetch.service: Deactivated successfully. Jan 15 23:45:14.461487 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Jan 15 23:45:14.462803 systemd[1]: Stopped target network.target - Network. Jan 15 23:45:14.464182 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Jan 15 23:45:14.464236 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Jan 15 23:45:14.465696 systemd[1]: Stopped target paths.target - Path Units. Jan 15 23:45:14.466937 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Jan 15 23:45:14.470589 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 15 23:45:14.471494 systemd[1]: Stopped target slices.target - Slice Units. Jan 15 23:45:14.472673 systemd[1]: Stopped target sockets.target - Socket Units. Jan 15 23:45:14.474095 systemd[1]: iscsid.socket: Deactivated successfully. Jan 15 23:45:14.474137 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Jan 15 23:45:14.475537 systemd[1]: iscsiuio.socket: Deactivated successfully. Jan 15 23:45:14.475568 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 15 23:45:14.476831 systemd[1]: ignition-setup.service: Deactivated successfully. Jan 15 23:45:14.476882 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Jan 15 23:45:14.478233 systemd[1]: ignition-setup-pre.service: Deactivated successfully. 
Jan 15 23:45:14.478272 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Jan 15 23:45:14.480094 systemd[1]: initrd-setup-root.service: Deactivated successfully. Jan 15 23:45:14.480142 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Jan 15 23:45:14.481587 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Jan 15 23:45:14.482991 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Jan 15 23:45:14.488705 systemd[1]: systemd-resolved.service: Deactivated successfully. Jan 15 23:45:14.488809 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Jan 15 23:45:14.492455 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Jan 15 23:45:14.492703 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Jan 15 23:45:14.492740 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 15 23:45:14.496067 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Jan 15 23:45:14.500480 systemd[1]: systemd-networkd.service: Deactivated successfully. Jan 15 23:45:14.501493 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Jan 15 23:45:14.504244 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Jan 15 23:45:14.504379 systemd[1]: Stopped target network-pre.target - Preparation for Network. Jan 15 23:45:14.505759 systemd[1]: systemd-networkd.socket: Deactivated successfully. Jan 15 23:45:14.505799 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Jan 15 23:45:14.508151 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Jan 15 23:45:14.508899 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Jan 15 23:45:14.508954 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 15 23:45:14.510407 systemd[1]: systemd-sysctl.service: Deactivated successfully. Jan 15 23:45:14.510465 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Jan 15 23:45:14.512750 systemd[1]: systemd-modules-load.service: Deactivated successfully. Jan 15 23:45:14.512791 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Jan 15 23:45:14.514378 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 15 23:45:14.517071 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Jan 15 23:45:14.531182 systemd[1]: systemd-udevd.service: Deactivated successfully. Jan 15 23:45:14.531594 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 15 23:45:14.533203 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Jan 15 23:45:14.533236 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Jan 15 23:45:14.534816 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Jan 15 23:45:14.534848 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Jan 15 23:45:14.536249 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Jan 15 23:45:14.536297 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Jan 15 23:45:14.538451 systemd[1]: dracut-cmdline.service: Deactivated successfully. Jan 15 23:45:14.538496 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. 
Jan 15 23:45:14.540790 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 15 23:45:14.540840 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 15 23:45:14.544165 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Jan 15 23:45:14.545599 systemd[1]: systemd-network-generator.service: Deactivated successfully. Jan 15 23:45:14.545654 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Jan 15 23:45:14.548327 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Jan 15 23:45:14.548364 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 15 23:45:14.551125 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 15 23:45:14.551162 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 15 23:45:14.554417 systemd[1]: network-cleanup.service: Deactivated successfully. Jan 15 23:45:14.575719 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Jan 15 23:45:14.581546 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Jan 15 23:45:14.581656 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Jan 15 23:45:14.583537 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Jan 15 23:45:14.585789 systemd[1]: Starting initrd-switch-root.service - Switch Root... Jan 15 23:45:14.620265 systemd[1]: Switching root. Jan 15 23:45:14.652811 systemd-journald[312]: Journal stopped Jan 15 23:45:15.497397 systemd-journald[312]: Received SIGTERM from PID 1 (systemd). Jan 15 23:45:15.497480 kernel: SELinux: policy capability network_peer_controls=1 Jan 15 23:45:15.497494 kernel: SELinux: policy capability open_perms=1 Jan 15 23:45:15.497503 kernel: SELinux: policy capability extended_socket_class=1 Jan 15 23:45:15.497516 kernel: SELinux: policy capability always_check_network=0 Jan 15 23:45:15.497528 kernel: SELinux: policy capability cgroup_seclabel=1 Jan 15 23:45:15.497546 kernel: SELinux: policy capability nnp_nosuid_transition=1 Jan 15 23:45:15.497559 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Jan 15 23:45:15.497571 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Jan 15 23:45:15.497580 kernel: SELinux: policy capability userspace_initial_context=0 Jan 15 23:45:15.497589 kernel: audit: type=1403 audit(1768520714.790:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Jan 15 23:45:15.497605 systemd[1]: Successfully loaded SELinux policy in 56.818ms. Jan 15 23:45:15.497622 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.551ms. Jan 15 23:45:15.497636 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 15 23:45:15.497647 systemd[1]: Detected virtualization kvm. Jan 15 23:45:15.497658 systemd[1]: Detected architecture arm64. Jan 15 23:45:15.497668 systemd[1]: Detected first boot. Jan 15 23:45:15.497677 systemd[1]: Hostname set to <ci-4459-2-2-n-b7ec270451>. Jan 15 23:45:15.497687 systemd[1]: Initializing machine ID from VM UUID. Jan 15 23:45:15.497697 zram_generator::config[1170]: No configuration found.
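"Initializing machine ID from VM UUID" above is first-boot behavior on KVM: with no /etc/machine-id present yet, systemd seeds it from the hypervisor-provided DMI product UUID. Roughly where that value is exposed (a sketch; the exact normalization is systemd's, not this snippet's):

    # On a KVM guest the VM UUID is visible through DMI.
    with open("/sys/class/dmi/id/product_uuid") as f:
        vm_uuid = f.read().strip()
    print(vm_uuid.replace("-", "").lower())  # 32 hex chars, machine-id style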
Jan 15 23:45:15.497711 kernel: NET: Registered PF_VSOCK protocol family Jan 15 23:45:15.497721 systemd[1]: Populated /etc with preset unit settings. Jan 15 23:45:15.497731 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Jan 15 23:45:15.497742 systemd[1]: initrd-switch-root.service: Deactivated successfully. Jan 15 23:45:15.497752 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Jan 15 23:45:15.497762 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Jan 15 23:45:15.497775 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Jan 15 23:45:15.497785 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Jan 15 23:45:15.497795 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Jan 15 23:45:15.497805 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Jan 15 23:45:15.497815 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Jan 15 23:45:15.497824 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Jan 15 23:45:15.497836 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Jan 15 23:45:15.497851 systemd[1]: Created slice user.slice - User and Session Slice. Jan 15 23:45:15.497917 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 15 23:45:15.497943 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 15 23:45:15.497956 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Jan 15 23:45:15.497966 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Jan 15 23:45:15.497977 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Jan 15 23:45:15.497988 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 15 23:45:15.498001 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... Jan 15 23:45:15.498011 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 15 23:45:15.498022 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 15 23:45:15.498032 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Jan 15 23:45:15.498042 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Jan 15 23:45:15.498052 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Jan 15 23:45:15.498061 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Jan 15 23:45:15.498073 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 15 23:45:15.498085 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 15 23:45:15.498095 systemd[1]: Reached target slices.target - Slice Units. Jan 15 23:45:15.498105 systemd[1]: Reached target swap.target - Swaps. Jan 15 23:45:15.498116 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Jan 15 23:45:15.498126 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Jan 15 23:45:15.498136 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Jan 15 23:45:15.498146 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. 
Jan 15 23:45:15.498156 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 15 23:45:15.498168 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 15 23:45:15.498178 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Jan 15 23:45:15.498188 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Jan 15 23:45:15.498198 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Jan 15 23:45:15.498207 systemd[1]: Mounting media.mount - External Media Directory... Jan 15 23:45:15.498217 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Jan 15 23:45:15.498227 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Jan 15 23:45:15.498240 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Jan 15 23:45:15.498250 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Jan 15 23:45:15.498261 systemd[1]: Reached target machines.target - Containers. Jan 15 23:45:15.498271 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Jan 15 23:45:15.498282 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 15 23:45:15.498313 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 15 23:45:15.498326 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jan 15 23:45:15.498336 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 15 23:45:15.498346 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 15 23:45:15.498356 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 15 23:45:15.498368 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jan 15 23:45:15.498378 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 15 23:45:15.498388 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Jan 15 23:45:15.498401 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Jan 15 23:45:15.498411 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Jan 15 23:45:15.498422 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Jan 15 23:45:15.498456 systemd[1]: Stopped systemd-fsck-usr.service. Jan 15 23:45:15.498470 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 15 23:45:15.498479 kernel: loop: module loaded Jan 15 23:45:15.498489 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 15 23:45:15.498499 kernel: fuse: init (API version 7.41) Jan 15 23:45:15.498509 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 15 23:45:15.498523 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 15 23:45:15.498535 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... 
Jan 15 23:45:15.498550 kernel: ACPI: bus type drm_connector registered Jan 15 23:45:15.498561 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Jan 15 23:45:15.498574 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 15 23:45:15.498590 systemd[1]: verity-setup.service: Deactivated successfully. Jan 15 23:45:15.498602 systemd[1]: Stopped verity-setup.service. Jan 15 23:45:15.498635 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Jan 15 23:45:15.498653 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Jan 15 23:45:15.498666 systemd[1]: Mounted media.mount - External Media Directory. Jan 15 23:45:15.498676 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Jan 15 23:45:15.498687 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Jan 15 23:45:15.498728 systemd-journald[1238]: Collecting audit messages is disabled. Jan 15 23:45:15.498754 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Jan 15 23:45:15.498766 systemd-journald[1238]: Journal started Jan 15 23:45:15.498787 systemd-journald[1238]: Runtime Journal (/run/log/journal/83774ded3dbc43478b42bd0dbc5602f8) is 8M, max 319.5M, 311.5M free. Jan 15 23:45:15.286146 systemd[1]: Queued start job for default target multi-user.target. Jan 15 23:45:15.309512 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Jan 15 23:45:15.309915 systemd[1]: systemd-journald.service: Deactivated successfully. Jan 15 23:45:15.505465 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 15 23:45:15.506998 systemd[1]: Started systemd-journald.service - Journal Service. Jan 15 23:45:15.507865 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Jan 15 23:45:15.509082 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jan 15 23:45:15.509280 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Jan 15 23:45:15.510603 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 15 23:45:15.510769 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 15 23:45:15.511945 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 15 23:45:15.513483 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 15 23:45:15.514596 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 15 23:45:15.514756 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 15 23:45:15.516068 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jan 15 23:45:15.516230 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Jan 15 23:45:15.517675 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 15 23:45:15.519471 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 15 23:45:15.520606 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 15 23:45:15.521730 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 15 23:45:15.523117 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Jan 15 23:45:15.524546 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Jan 15 23:45:15.535854 systemd[1]: Reached target network-pre.target - Preparation for Network. 
Jan 15 23:45:15.538138 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Jan 15 23:45:15.540118 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Jan 15 23:45:15.541044 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Jan 15 23:45:15.541072 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 15 23:45:15.542804 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Jan 15 23:45:15.550588 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jan 15 23:45:15.552636 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 15 23:45:15.554572 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Jan 15 23:45:15.556266 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Jan 15 23:45:15.557251 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 15 23:45:15.559449 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Jan 15 23:45:15.560360 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 15 23:45:15.561628 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 15 23:45:15.570067 systemd-journald[1238]: Time spent on flushing to /var/log/journal/83774ded3dbc43478b42bd0dbc5602f8 is 23.292ms for 1679 entries. Jan 15 23:45:15.570067 systemd-journald[1238]: System Journal (/var/log/journal/83774ded3dbc43478b42bd0dbc5602f8) is 8M, max 584.8M, 576.8M free. Jan 15 23:45:15.611745 systemd-journald[1238]: Received client request to flush runtime journal. Jan 15 23:45:15.611803 kernel: loop0: detected capacity change from 0 to 119840 Jan 15 23:45:15.565694 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Jan 15 23:45:15.569182 systemd[1]: Starting systemd-sysusers.service - Create System Users... Jan 15 23:45:15.572300 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Jan 15 23:45:15.573627 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jan 15 23:45:15.583640 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Jan 15 23:45:15.584695 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Jan 15 23:45:15.590750 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Jan 15 23:45:15.592046 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 15 23:45:15.599203 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 15 23:45:15.613884 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Jan 15 23:45:15.629096 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Jan 15 23:45:15.647481 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jan 15 23:45:15.649475 systemd[1]: Finished systemd-sysusers.service - Create System Users. Jan 15 23:45:15.653676 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... 
Jan 15 23:45:15.675465 kernel: loop1: detected capacity change from 0 to 100632 Jan 15 23:45:15.703756 systemd-tmpfiles[1307]: ACLs are not supported, ignoring. Jan 15 23:45:15.703772 systemd-tmpfiles[1307]: ACLs are not supported, ignoring. Jan 15 23:45:15.707685 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 15 23:45:15.737471 kernel: loop2: detected capacity change from 0 to 1632 Jan 15 23:45:15.777468 kernel: loop3: detected capacity change from 0 to 211168 Jan 15 23:45:15.836460 kernel: loop4: detected capacity change from 0 to 119840 Jan 15 23:45:15.852467 kernel: loop5: detected capacity change from 0 to 100632 Jan 15 23:45:15.876509 kernel: loop6: detected capacity change from 0 to 1632 Jan 15 23:45:15.881462 kernel: loop7: detected capacity change from 0 to 211168 Jan 15 23:45:15.901481 (sd-merge)[1317]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-stackit'. Jan 15 23:45:15.901956 (sd-merge)[1317]: Merged extensions into '/usr'. Jan 15 23:45:15.906657 systemd[1]: Reload requested from client PID 1289 ('systemd-sysext') (unit systemd-sysext.service)... Jan 15 23:45:15.906674 systemd[1]: Reloading... Jan 15 23:45:15.959503 zram_generator::config[1343]: No configuration found. Jan 15 23:45:16.097645 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Jan 15 23:45:16.097875 systemd[1]: Reloading finished in 190 ms. Jan 15 23:45:16.127184 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Jan 15 23:45:16.128637 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Jan 15 23:45:16.143159 systemd[1]: Starting ensure-sysext.service... Jan 15 23:45:16.145127 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 15 23:45:16.149631 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 15 23:45:16.159555 systemd[1]: Reload requested from client PID 1380 ('systemctl') (unit ensure-sysext.service)... Jan 15 23:45:16.159575 systemd[1]: Reloading... Jan 15 23:45:16.162825 systemd-tmpfiles[1381]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Jan 15 23:45:16.162868 systemd-tmpfiles[1381]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Jan 15 23:45:16.163124 systemd-tmpfiles[1381]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jan 15 23:45:16.163326 systemd-tmpfiles[1381]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Jan 15 23:45:16.163951 systemd-tmpfiles[1381]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Jan 15 23:45:16.164163 systemd-tmpfiles[1381]: ACLs are not supported, ignoring. Jan 15 23:45:16.164204 systemd-tmpfiles[1381]: ACLs are not supported, ignoring. Jan 15 23:45:16.167851 systemd-tmpfiles[1381]: Detected autofs mount point /boot during canonicalization of boot. Jan 15 23:45:16.167864 systemd-tmpfiles[1381]: Skipping /boot Jan 15 23:45:16.175589 systemd-tmpfiles[1381]: Detected autofs mount point /boot during canonicalization of boot. Jan 15 23:45:16.175606 systemd-tmpfiles[1381]: Skipping /boot Jan 15 23:45:16.178681 systemd-udevd[1382]: Using default interface naming scheme 'v255'. Jan 15 23:45:16.205470 zram_generator::config[1406]: No configuration found. 
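The (sd-merge) lines above show systemd-sysext overlaying four extension images onto /usr; the kubernetes image is the one Ignition linked into /etc/extensions during the files stage. A small sketch of the discovery step, assuming the standard search directories:

    import os

    # Some of the directories systemd-sysext scans for extension images.
    SEARCH = ["/etc/extensions", "/run/extensions", "/var/lib/extensions"]

    for d in SEARCH:
        if os.path.isdir(d):
            for entry in sorted(os.listdir(d)):
                print(os.path.join(d, entry))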
Jan 15 23:45:16.211706 ldconfig[1284]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Jan 15 23:45:16.365481 kernel: mousedev: PS/2 mouse device common for all mice Jan 15 23:45:16.385215 systemd[1]: Reloading finished in 225 ms. Jan 15 23:45:16.395981 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 15 23:45:16.397746 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Jan 15 23:45:16.410715 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 15 23:45:16.426215 systemd[1]: Finished ensure-sysext.service. Jan 15 23:45:16.428645 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. Jan 15 23:45:16.440267 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Jan 15 23:45:16.440939 kernel: [drm] pci: virtio-gpu-pci detected at 0000:06:00.0 Jan 15 23:45:16.440991 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Jan 15 23:45:16.441009 kernel: [drm] features: -context_init Jan 15 23:45:16.444596 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 15 23:45:16.446460 kernel: [drm] number of scanouts: 1 Jan 15 23:45:16.446578 kernel: [drm] number of cap sets: 0 Jan 15 23:45:16.451539 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:06:00.0 on minor 0 Jan 15 23:45:16.455584 kernel: Console: switching to colour frame buffer device 160x50 Jan 15 23:45:16.460758 kernel: virtio-pci 0000:06:00.0: [drm] fb0: virtio_gpudrmfb frame buffer device Jan 15 23:45:16.460547 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Jan 15 23:45:16.461576 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 15 23:45:16.463651 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 15 23:45:16.489239 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 15 23:45:16.492283 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 15 23:45:16.495777 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 15 23:45:16.498959 systemd[1]: Starting modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm... Jan 15 23:45:16.500099 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 15 23:45:16.501012 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Jan 15 23:45:16.502544 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 15 23:45:16.504713 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Jan 15 23:45:16.509680 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 15 23:45:16.530137 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 15 23:45:16.533628 systemd[1]: Reached target time-set.target - System Time Set. Jan 15 23:45:16.537265 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Jan 15 23:45:16.539081 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. 
Jan 15 23:45:16.540486 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 15 23:45:16.541671 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 15 23:45:16.541840 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 15 23:45:16.544247 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 15 23:45:16.544411 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 15 23:45:16.545663 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 15 23:45:16.545817 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 15 23:45:16.559458 kernel: pps_core: LinuxPPS API ver. 1 registered Jan 15 23:45:16.560204 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Jan 15 23:45:16.560326 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 15 23:45:16.560387 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 15 23:45:16.565645 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jan 15 23:45:16.570031 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 15 23:45:16.571670 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Jan 15 23:45:16.573463 kernel: PTP clock support registered Jan 15 23:45:16.575635 augenrules[1544]: No rules Jan 15 23:45:16.576696 systemd[1]: modprobe@ptp_kvm.service: Deactivated successfully. Jan 15 23:45:16.579774 systemd[1]: Finished modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm. Jan 15 23:45:16.581524 systemd[1]: audit-rules.service: Deactivated successfully. Jan 15 23:45:16.581739 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 15 23:45:16.583054 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jan 15 23:45:16.592646 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Jan 15 23:45:16.598040 systemd[1]: Starting systemd-update-done.service - Update is Completed... Jan 15 23:45:16.603378 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 15 23:45:16.611633 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 15 23:45:16.614154 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Jan 15 23:45:16.615521 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 15 23:45:16.619715 systemd[1]: Finished systemd-update-done.service - Update is Completed. Jan 15 23:45:16.621285 systemd[1]: Started systemd-userdbd.service - User Database Manager. Jan 15 23:45:16.628725 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Jan 15 23:45:16.630002 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jan 15 23:45:16.676662 systemd-resolved[1524]: Positive Trust Anchors: Jan 15 23:45:16.676992 systemd-resolved[1524]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 15 23:45:16.677070 systemd-resolved[1524]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 15 23:45:16.677222 systemd-networkd[1522]: lo: Link UP Jan 15 23:45:16.677230 systemd-networkd[1522]: lo: Gained carrier Jan 15 23:45:16.678282 systemd-networkd[1522]: Enumeration completed Jan 15 23:45:16.678401 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 15 23:45:16.678731 systemd-networkd[1522]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jan 15 23:45:16.678742 systemd-networkd[1522]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 15 23:45:16.679141 systemd-networkd[1522]: eth0: Link UP Jan 15 23:45:16.679299 systemd-networkd[1522]: eth0: Gained carrier Jan 15 23:45:16.679317 systemd-networkd[1522]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jan 15 23:45:16.680784 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Jan 15 23:45:16.683170 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Jan 15 23:45:16.684588 systemd-resolved[1524]: Using system hostname 'ci-4459-2-2-n-b7ec270451'. Jan 15 23:45:16.686090 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 15 23:45:16.687210 systemd[1]: Reached target network.target - Network. Jan 15 23:45:16.688045 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 15 23:45:16.702543 systemd-networkd[1522]: eth0: DHCPv4 address 10.0.10.219/25, gateway 10.0.10.129 acquired from 10.0.10.129 Jan 15 23:45:16.704980 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Jan 15 23:45:16.708876 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 15 23:45:16.710467 systemd[1]: Reached target sysinit.target - System Initialization. Jan 15 23:45:16.711392 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Jan 15 23:45:16.712549 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Jan 15 23:45:16.713656 systemd[1]: Started logrotate.timer - Daily rotation of log files. Jan 15 23:45:16.714579 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jan 15 23:45:16.715608 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Jan 15 23:45:16.716575 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Jan 15 23:45:16.716612 systemd[1]: Reached target paths.target - Path Units. Jan 15 23:45:16.717288 systemd[1]: Reached target timers.target - Timer Units. 
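The positive trust anchor systemd-resolved loads here is the root zone's KSK-2017 DS record: key tag 20326, algorithm 8 (RSASHA256), digest type 2 (SHA-256). Per RFC 4509, a type-2 digest is the SHA-256 of the owner name in canonical wire form concatenated with the DNSKEY RDATA; a sketch of that calculation, with the actual root key material deliberately omitted:

    import hashlib

    def ds_sha256_digest(owner_wire: bytes, dnskey_rdata: bytes) -> str:
        """RFC 4509: DS digest (type 2) = SHA-256(owner name | DNSKEY RDATA)."""
        return hashlib.sha256(owner_wire + dnskey_rdata).hexdigest()

    # The root zone's canonical owner name is a single zero byte; dnskey_rdata
    # would be the wire-format flags/protocol/algorithm/public-key of the root
    # KSK (not reproduced here). The result should equal the e06d44b8... value
    # in the trust anchor above.
    ROOT_OWNER = b"\x00"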
Jan 15 23:45:16.720090 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Jan 15 23:45:16.722333 systemd[1]: Starting docker.socket - Docker Socket for the API... Jan 15 23:45:16.724980 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Jan 15 23:45:16.726210 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Jan 15 23:45:16.727292 systemd[1]: Reached target ssh-access.target - SSH Access Available. Jan 15 23:45:16.730330 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Jan 15 23:45:16.731660 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Jan 15 23:45:16.733244 systemd[1]: Listening on docker.socket - Docker Socket for the API. Jan 15 23:45:16.734271 systemd[1]: Reached target sockets.target - Socket Units. Jan 15 23:45:16.735093 systemd[1]: Reached target basic.target - Basic System. Jan 15 23:45:16.735891 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jan 15 23:45:16.735924 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Jan 15 23:45:16.738415 systemd[1]: Starting chronyd.service - NTP client/server... Jan 15 23:45:16.740004 systemd[1]: Starting containerd.service - containerd container runtime... Jan 15 23:45:16.741904 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Jan 15 23:45:16.743647 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Jan 15 23:45:16.746430 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jan 15 23:45:16.748474 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 23:45:16.748794 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jan 15 23:45:16.751048 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jan 15 23:45:16.751907 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Jan 15 23:45:16.758964 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jan 15 23:45:16.762398 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Jan 15 23:45:16.764457 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Jan 15 23:45:16.766481 jq[1588]: false Jan 15 23:45:16.768036 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Jan 15 23:45:16.771218 systemd[1]: Starting systemd-logind.service - User Login Management... Jan 15 23:45:16.771788 extend-filesystems[1590]: Found /dev/vda6 Jan 15 23:45:16.773022 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Jan 15 23:45:16.777151 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Jan 15 23:45:16.780458 extend-filesystems[1590]: Found /dev/vda9 Jan 15 23:45:16.777916 systemd[1]: Starting update-engine.service - Update Engine... Jan 15 23:45:16.781701 extend-filesystems[1590]: Checking size of /dev/vda9 Jan 15 23:45:16.782335 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... 
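The run of "Listening on ..." lines above (docker.socket, sshd.socket, systemd-hostnamed.socket) is systemd socket activation: the manager binds the sockets itself and starts the daemon only on first use, passing the listeners in as file descriptors 3 and up, advertised through the LISTEN_FDS/LISTEN_PID environment variables. A minimal consumer in Python (the function name is ours):

    import os
    import socket

    SD_LISTEN_FDS_START = 3  # systemd hands activated sockets over from fd 3

    def listen_fds():
        """Adopt sockets passed via systemd socket activation, if any."""
        if os.environ.get("LISTEN_PID") != str(os.getpid()):
            return []
        count = int(os.environ.get("LISTEN_FDS", "0"))
        return [socket.socket(fileno=SD_LISTEN_FDS_START + i)
                for i in range(count)]

    # With docker.socket, for example, fd 3 is the already-bound listener for
    # /run/docker.sock; the daemon accept()s on it instead of binding itself.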
Jan 15 23:45:16.786643 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jan 15 23:45:16.791001 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Jan 15 23:45:16.791212 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jan 15 23:45:16.794554 systemd[1]: motdgen.service: Deactivated successfully. Jan 15 23:45:16.797146 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Jan 15 23:45:16.797629 chronyd[1582]: chronyd version 4.7 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG) Jan 15 23:45:16.798384 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Jan 15 23:45:16.799118 chronyd[1582]: Loaded seccomp filter (level 2) Jan 15 23:45:16.800409 extend-filesystems[1590]: Resized partition /dev/vda9 Jan 15 23:45:16.802700 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Jan 15 23:45:16.803559 extend-filesystems[1619]: resize2fs 1.47.3 (8-Jul-2025) Jan 15 23:45:16.820082 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 12499963 blocks Jan 15 23:45:16.813690 systemd[1]: Started chronyd.service - NTP client/server. Jan 15 23:45:16.820245 jq[1608]: true Jan 15 23:45:16.827242 update_engine[1606]: I20260115 23:45:16.826954 1606 main.cc:92] Flatcar Update Engine starting Jan 15 23:45:16.831780 (ntainerd)[1623]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Jan 15 23:45:16.842450 jq[1621]: true Jan 15 23:45:16.846136 tar[1614]: linux-arm64/LICENSE Jan 15 23:45:16.849658 tar[1614]: linux-arm64/helm Jan 15 23:45:16.864027 systemd-logind[1599]: New seat seat0. Jan 15 23:45:16.868068 systemd-logind[1599]: Watching system buttons on /dev/input/event0 (Power Button) Jan 15 23:45:16.868095 systemd-logind[1599]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard) Jan 15 23:45:16.868361 systemd[1]: Started systemd-logind.service - User Login Management. Jan 15 23:45:16.877123 dbus-daemon[1585]: [system] SELinux support is enabled Jan 15 23:45:16.877447 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jan 15 23:45:16.880655 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jan 15 23:45:16.880698 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Jan 15 23:45:16.881780 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Jan 15 23:45:16.881806 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Jan 15 23:45:16.884406 systemd[1]: Started update-engine.service - Update Engine. Jan 15 23:45:16.884806 dbus-daemon[1585]: [system] Successfully activated service 'org.freedesktop.systemd1' Jan 15 23:45:16.885025 update_engine[1606]: I20260115 23:45:16.884963 1606 update_check_scheduler.cc:74] Next update check in 7m22s Jan 15 23:45:16.887767 systemd[1]: Started locksmithd.service - Cluster reboot manager. 
Jan 15 23:45:16.942696 locksmithd[1645]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jan 15 23:45:17.005885 bash[1649]: Updated "/home/core/.ssh/authorized_keys" Jan 15 23:45:17.008478 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jan 15 23:45:17.013714 systemd[1]: Starting sshkeys.service... Jan 15 23:45:17.040222 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Jan 15 23:45:17.040849 containerd[1623]: time="2026-01-15T23:45:17Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Jan 15 23:45:17.042413 containerd[1623]: time="2026-01-15T23:45:17.042261360Z" level=info msg="starting containerd" revision=4ac6c20c7bbf8177f29e46bbdc658fec02ffb8ad version=v2.0.7 Jan 15 23:45:17.044778 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Jan 15 23:45:17.057380 containerd[1623]: time="2026-01-15T23:45:17.057334080Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="10.08µs" Jan 15 23:45:17.057823 containerd[1623]: time="2026-01-15T23:45:17.057792400Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Jan 15 23:45:17.057941 containerd[1623]: time="2026-01-15T23:45:17.057911560Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Jan 15 23:45:17.058536 containerd[1623]: time="2026-01-15T23:45:17.058343720Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Jan 15 23:45:17.058624 containerd[1623]: time="2026-01-15T23:45:17.058608320Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Jan 15 23:45:17.058874 containerd[1623]: time="2026-01-15T23:45:17.058846680Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 15 23:45:17.059491 containerd[1623]: time="2026-01-15T23:45:17.059016320Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 15 23:45:17.059491 containerd[1623]: time="2026-01-15T23:45:17.059204720Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 15 23:45:17.060093 containerd[1623]: time="2026-01-15T23:45:17.059911720Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 15 23:45:17.060184 containerd[1623]: time="2026-01-15T23:45:17.060163560Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 15 23:45:17.060405 containerd[1623]: time="2026-01-15T23:45:17.060382840Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 15 23:45:17.060886 containerd[1623]: time="2026-01-15T23:45:17.060595440Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native 
type=io.containerd.snapshotter.v1 Jan 15 23:45:17.060886 containerd[1623]: time="2026-01-15T23:45:17.060700680Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Jan 15 23:45:17.061509 containerd[1623]: time="2026-01-15T23:45:17.061485160Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 15 23:45:17.061803 containerd[1623]: time="2026-01-15T23:45:17.061780200Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 15 23:45:17.061868 containerd[1623]: time="2026-01-15T23:45:17.061854880Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Jan 15 23:45:17.061951 containerd[1623]: time="2026-01-15T23:45:17.061937280Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Jan 15 23:45:17.062394 containerd[1623]: time="2026-01-15T23:45:17.062368320Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Jan 15 23:45:17.062688 containerd[1623]: time="2026-01-15T23:45:17.062665960Z" level=info msg="metadata content store policy set" policy=shared Jan 15 23:45:17.063473 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 23:45:17.088277 containerd[1623]: time="2026-01-15T23:45:17.088192760Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Jan 15 23:45:17.089467 containerd[1623]: time="2026-01-15T23:45:17.088716960Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Jan 15 23:45:17.089467 containerd[1623]: time="2026-01-15T23:45:17.088746440Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Jan 15 23:45:17.089467 containerd[1623]: time="2026-01-15T23:45:17.088761520Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Jan 15 23:45:17.089467 containerd[1623]: time="2026-01-15T23:45:17.088775680Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Jan 15 23:45:17.089467 containerd[1623]: time="2026-01-15T23:45:17.088786880Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Jan 15 23:45:17.089467 containerd[1623]: time="2026-01-15T23:45:17.088891040Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Jan 15 23:45:17.089467 containerd[1623]: time="2026-01-15T23:45:17.088905120Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Jan 15 23:45:17.089467 containerd[1623]: time="2026-01-15T23:45:17.088918440Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Jan 15 23:45:17.089467 containerd[1623]: time="2026-01-15T23:45:17.088928280Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Jan 15 23:45:17.089467 containerd[1623]: time="2026-01-15T23:45:17.088937280Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Jan 15 23:45:17.089467 containerd[1623]: time="2026-01-15T23:45:17.088954040Z" level=info msg="loading 
plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Jan 15 23:45:17.089467 containerd[1623]: time="2026-01-15T23:45:17.089218480Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Jan 15 23:45:17.089467 containerd[1623]: time="2026-01-15T23:45:17.089244360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Jan 15 23:45:17.089467 containerd[1623]: time="2026-01-15T23:45:17.089311680Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Jan 15 23:45:17.089747 containerd[1623]: time="2026-01-15T23:45:17.089326920Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Jan 15 23:45:17.089747 containerd[1623]: time="2026-01-15T23:45:17.089337880Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Jan 15 23:45:17.089747 containerd[1623]: time="2026-01-15T23:45:17.089350320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Jan 15 23:45:17.089747 containerd[1623]: time="2026-01-15T23:45:17.089361640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Jan 15 23:45:17.089747 containerd[1623]: time="2026-01-15T23:45:17.089372000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Jan 15 23:45:17.089747 containerd[1623]: time="2026-01-15T23:45:17.089388440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Jan 15 23:45:17.089747 containerd[1623]: time="2026-01-15T23:45:17.089400680Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Jan 15 23:45:17.089747 containerd[1623]: time="2026-01-15T23:45:17.089411000Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Jan 15 23:45:17.090158 containerd[1623]: time="2026-01-15T23:45:17.090134520Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Jan 15 23:45:17.090227 containerd[1623]: time="2026-01-15T23:45:17.090216120Z" level=info msg="Start snapshots syncer" Jan 15 23:45:17.090294 containerd[1623]: time="2026-01-15T23:45:17.090281880Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Jan 15 23:45:17.090778 containerd[1623]: time="2026-01-15T23:45:17.090735560Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Jan 15 23:45:17.090950 containerd[1623]: time="2026-01-15T23:45:17.090933160Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Jan 15 23:45:17.091051 containerd[1623]: time="2026-01-15T23:45:17.091038000Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Jan 15 23:45:17.091338 containerd[1623]: time="2026-01-15T23:45:17.091318320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Jan 15 23:45:17.091426 containerd[1623]: time="2026-01-15T23:45:17.091412360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Jan 15 23:45:17.091503 containerd[1623]: time="2026-01-15T23:45:17.091490320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Jan 15 23:45:17.091553 containerd[1623]: time="2026-01-15T23:45:17.091540560Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Jan 15 23:45:17.091620 containerd[1623]: time="2026-01-15T23:45:17.091606680Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Jan 15 23:45:17.091676 containerd[1623]: time="2026-01-15T23:45:17.091663160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Jan 15 23:45:17.091734 containerd[1623]: time="2026-01-15T23:45:17.091722800Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Jan 15 23:45:17.091805 containerd[1623]: time="2026-01-15T23:45:17.091792400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Jan 15 23:45:17.091862 containerd[1623]: 
time="2026-01-15T23:45:17.091850200Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Jan 15 23:45:17.091911 containerd[1623]: time="2026-01-15T23:45:17.091899720Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Jan 15 23:45:17.091991 containerd[1623]: time="2026-01-15T23:45:17.091978200Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 15 23:45:17.092318 containerd[1623]: time="2026-01-15T23:45:17.092088160Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 15 23:45:17.092318 containerd[1623]: time="2026-01-15T23:45:17.092105280Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 15 23:45:17.092318 containerd[1623]: time="2026-01-15T23:45:17.092115520Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 15 23:45:17.092318 containerd[1623]: time="2026-01-15T23:45:17.092126000Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Jan 15 23:45:17.092318 containerd[1623]: time="2026-01-15T23:45:17.092136240Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Jan 15 23:45:17.092318 containerd[1623]: time="2026-01-15T23:45:17.092147400Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Jan 15 23:45:17.092318 containerd[1623]: time="2026-01-15T23:45:17.092242400Z" level=info msg="runtime interface created" Jan 15 23:45:17.092318 containerd[1623]: time="2026-01-15T23:45:17.092247760Z" level=info msg="created NRI interface" Jan 15 23:45:17.092318 containerd[1623]: time="2026-01-15T23:45:17.092255480Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Jan 15 23:45:17.092318 containerd[1623]: time="2026-01-15T23:45:17.092267040Z" level=info msg="Connect containerd service" Jan 15 23:45:17.092318 containerd[1623]: time="2026-01-15T23:45:17.092290120Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jan 15 23:45:17.093425 containerd[1623]: time="2026-01-15T23:45:17.093396120Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 15 23:45:17.176457 kernel: EXT4-fs (vda9): resized filesystem to 12499963 Jan 15 23:45:17.178457 containerd[1623]: time="2026-01-15T23:45:17.178388320Z" level=info msg=serving... 
address=/run/containerd/containerd.sock.ttrpc Jan 15 23:45:17.178670 containerd[1623]: time="2026-01-15T23:45:17.178629080Z" level=info msg="Start subscribing containerd event" Jan 15 23:45:17.178757 containerd[1623]: time="2026-01-15T23:45:17.178744280Z" level=info msg="Start recovering state" Jan 15 23:45:17.178886 containerd[1623]: time="2026-01-15T23:45:17.178872800Z" level=info msg="Start event monitor" Jan 15 23:45:17.178950 containerd[1623]: time="2026-01-15T23:45:17.178938760Z" level=info msg="Start cni network conf syncer for default" Jan 15 23:45:17.179001 containerd[1623]: time="2026-01-15T23:45:17.178990480Z" level=info msg="Start streaming server" Jan 15 23:45:17.196509 containerd[1623]: time="2026-01-15T23:45:17.179031240Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Jan 15 23:45:17.196509 containerd[1623]: time="2026-01-15T23:45:17.179041120Z" level=info msg="runtime interface starting up..." Jan 15 23:45:17.196509 containerd[1623]: time="2026-01-15T23:45:17.179047200Z" level=info msg="starting plugins..." Jan 15 23:45:17.196509 containerd[1623]: time="2026-01-15T23:45:17.179064200Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Jan 15 23:45:17.196509 containerd[1623]: time="2026-01-15T23:45:17.179250880Z" level=info msg=serving... address=/run/containerd/containerd.sock Jan 15 23:45:17.196509 containerd[1623]: time="2026-01-15T23:45:17.179332160Z" level=info msg="containerd successfully booted in 0.140468s" Jan 15 23:45:17.179422 systemd[1]: Started containerd.service - containerd container runtime. Jan 15 23:45:17.199336 extend-filesystems[1619]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Jan 15 23:45:17.199336 extend-filesystems[1619]: old_desc_blocks = 1, new_desc_blocks = 6 Jan 15 23:45:17.199336 extend-filesystems[1619]: The filesystem on /dev/vda9 is now 12499963 (4k) blocks long. Jan 15 23:45:17.202259 extend-filesystems[1590]: Resized filesystem in /dev/vda9 Jan 15 23:45:17.200731 systemd[1]: extend-filesystems.service: Deactivated successfully. Jan 15 23:45:17.202480 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Jan 15 23:45:17.214677 sshd_keygen[1618]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jan 15 23:45:17.234419 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jan 15 23:45:17.237087 systemd[1]: Starting issuegen.service - Generate /run/issue... Jan 15 23:45:17.255580 systemd[1]: issuegen.service: Deactivated successfully. Jan 15 23:45:17.255781 systemd[1]: Finished issuegen.service - Generate /run/issue. Jan 15 23:45:17.258323 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jan 15 23:45:17.274182 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Jan 15 23:45:17.276457 systemd[1]: Started sshd@0-10.0.10.219:22-68.220.241.50:37366.service - OpenSSH per-connection server daemon (68.220.241.50:37366). Jan 15 23:45:17.278158 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jan 15 23:45:17.283768 systemd[1]: Started getty@tty1.service - Getty on tty1. Jan 15 23:45:17.285582 tar[1614]: linux-arm64/README.md Jan 15 23:45:17.286069 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Jan 15 23:45:17.287358 systemd[1]: Reached target getty.target - Login Prompts. Jan 15 23:45:17.311400 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. 
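The extend-filesystems figures are easy to sanity-check: at the 4 KiB block size reported by the kernel, the online resize grows the root filesystem from 1617920 blocks (about 6.2 GiB) to 12499963 blocks (about 47.7 GiB):

    # Sanity-check the online ext4 resize logged above (4 KiB blocks).
    BLOCK = 4096
    old_blocks, new_blocks = 1_617_920, 12_499_963

    print(f"{old_blocks * BLOCK / 2**30:.2f} GiB")  # 6.17 GiB before
    print(f"{new_blocks * BLOCK / 2**30:.2f} GiB")  # 47.68 GiB after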
Jan 15 23:45:17.764464 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 23:45:17.863885 systemd-networkd[1522]: eth0: Gained IPv6LL Jan 15 23:45:17.866488 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jan 15 23:45:17.868317 systemd[1]: Reached target network-online.target - Network is Online. Jan 15 23:45:17.870987 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 23:45:17.873233 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jan 15 23:45:17.899346 sshd[1697]: Accepted publickey for core from 68.220.241.50 port 37366 ssh2: RSA SHA256:mlFwV2mNhWAX9miLlDaAVOccp1AYIG1i1y9cTR0vub4 Jan 15 23:45:17.901146 sshd-session[1697]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 23:45:17.903026 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jan 15 23:45:17.910527 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jan 15 23:45:17.912596 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jan 15 23:45:17.920121 systemd-logind[1599]: New session 1 of user core. Jan 15 23:45:17.933114 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jan 15 23:45:17.937126 systemd[1]: Starting user@500.service - User Manager for UID 500... Jan 15 23:45:17.952680 (systemd)[1721]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Jan 15 23:45:17.956192 systemd-logind[1599]: New session c1 of user core. Jan 15 23:45:18.075574 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 23:45:18.083847 systemd[1721]: Queued start job for default target default.target. Jan 15 23:45:18.100556 systemd[1721]: Created slice app.slice - User Application Slice. Jan 15 23:45:18.100592 systemd[1721]: Reached target paths.target - Paths. Jan 15 23:45:18.100631 systemd[1721]: Reached target timers.target - Timers. Jan 15 23:45:18.101824 systemd[1721]: Starting dbus.socket - D-Bus User Message Bus Socket... Jan 15 23:45:18.111524 systemd[1721]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jan 15 23:45:18.111581 systemd[1721]: Reached target sockets.target - Sockets. Jan 15 23:45:18.111621 systemd[1721]: Reached target basic.target - Basic System. Jan 15 23:45:18.111646 systemd[1721]: Reached target default.target - Main User Target. Jan 15 23:45:18.111671 systemd[1721]: Startup finished in 149ms. Jan 15 23:45:18.111987 systemd[1]: Started user@500.service - User Manager for UID 500. Jan 15 23:45:18.114362 systemd[1]: Started session-1.scope - Session 1 of User core. Jan 15 23:45:18.556097 systemd[1]: Started sshd@1-10.0.10.219:22-68.220.241.50:37368.service - OpenSSH per-connection server daemon (68.220.241.50:37368). Jan 15 23:45:18.794301 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 23:45:18.814958 (kubelet)[1741]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 23:45:19.182701 sshd[1733]: Accepted publickey for core from 68.220.241.50 port 37368 ssh2: RSA SHA256:mlFwV2mNhWAX9miLlDaAVOccp1AYIG1i1y9cTR0vub4 Jan 15 23:45:19.183985 sshd-session[1733]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 23:45:19.189237 systemd-logind[1599]: New session 2 of user core. Jan 15 23:45:19.199685 systemd[1]: Started session-2.scope - Session 2 of User core. 
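The "SHA256:mlFwV2..." token in each Accepted publickey line is the standard OpenSSH fingerprint: the SHA-256 digest of the binary public-key blob, base64-encoded with the trailing '=' padding stripped. It can be recomputed from the matching authorized_keys entry:

    import base64
    import hashlib

    def ssh_fingerprint(authorized_keys_line: str) -> str:
        """SHA256 fingerprint as sshd logs it: base64(sha256(blob)), no '='."""
        blob_b64 = authorized_keys_line.split()[1]  # "ssh-rsa AAAA... comment"
        blob = base64.b64decode(blob_b64)
        digest = hashlib.sha256(blob).digest()
        return "SHA256:" + base64.b64encode(digest).decode().rstrip("=")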
Jan 15 23:45:19.382550 kubelet[1741]: E0115 23:45:19.382467 1741 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 23:45:19.385232 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 23:45:19.385363 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 23:45:19.385691 systemd[1]: kubelet.service: Consumed 791ms CPU time, 259.7M memory peak. Jan 15 23:45:19.624521 sshd[1748]: Connection closed by 68.220.241.50 port 37368 Jan 15 23:45:19.623868 sshd-session[1733]: pam_unix(sshd:session): session closed for user core Jan 15 23:45:19.627623 systemd[1]: sshd@1-10.0.10.219:22-68.220.241.50:37368.service: Deactivated successfully. Jan 15 23:45:19.629700 systemd[1]: session-2.scope: Deactivated successfully. Jan 15 23:45:19.631048 systemd-logind[1599]: Session 2 logged out. Waiting for processes to exit. Jan 15 23:45:19.632720 systemd-logind[1599]: Removed session 2. Jan 15 23:45:19.745499 systemd[1]: Started sshd@2-10.0.10.219:22-68.220.241.50:37380.service - OpenSSH per-connection server daemon (68.220.241.50:37380). Jan 15 23:45:19.779486 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 23:45:20.083482 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 23:45:20.389492 sshd[1756]: Accepted publickey for core from 68.220.241.50 port 37380 ssh2: RSA SHA256:mlFwV2mNhWAX9miLlDaAVOccp1AYIG1i1y9cTR0vub4 Jan 15 23:45:20.390114 sshd-session[1756]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 23:45:20.395185 systemd-logind[1599]: New session 3 of user core. Jan 15 23:45:20.405839 systemd[1]: Started session-3.scope - Session 3 of User core. Jan 15 23:45:20.829544 sshd[1761]: Connection closed by 68.220.241.50 port 37380 Jan 15 23:45:20.829745 sshd-session[1756]: pam_unix(sshd:session): session closed for user core Jan 15 23:45:20.834129 systemd[1]: sshd@2-10.0.10.219:22-68.220.241.50:37380.service: Deactivated successfully. Jan 15 23:45:20.835928 systemd[1]: session-3.scope: Deactivated successfully. Jan 15 23:45:20.838144 systemd-logind[1599]: Session 3 logged out. Waiting for processes to exit. Jan 15 23:45:20.839280 systemd-logind[1599]: Removed session 3. 
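The kubelet failure above is the usual pre-bootstrap state rather than a fault: the unit is enabled before anything has written /var/lib/kubelet/config.yaml, so each start exits 1 and systemd keeps scheduling restarts (the counter climbs to 3 later in this log) until kubeadm provides the file during init/join. Purely for illustration, and assuming the kubelet's YAML loader also accepts JSON (YAML is a superset of JSON), a hypothetical placeholder could be written like this; the field values are not what kubeadm would generate:

    import json

    # Hypothetical minimal KubeletConfiguration -- on a real node kubeadm
    # writes this file itself during "kubeadm init" / "kubeadm join".
    config = {
        "apiVersion": "kubelet.config.k8s.io/v1beta1",
        "kind": "KubeletConfiguration",
        "cgroupDriver": "systemd",
    }
    with open("/var/lib/kubelet/config.yaml", "w") as f:
        json.dump(config, f, indent=2)  # JSON output is also valid YAML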
Jan 15 23:45:23.787481 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 23:45:23.792995 coreos-metadata[1584]: Jan 15 23:45:23.792 WARN failed to locate config-drive, using the metadata service API instead Jan 15 23:45:23.807444 coreos-metadata[1584]: Jan 15 23:45:23.807 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Jan 15 23:45:24.095491 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 23:45:24.101482 coreos-metadata[1664]: Jan 15 23:45:24.101 WARN failed to locate config-drive, using the metadata service API instead Jan 15 23:45:24.114060 coreos-metadata[1664]: Jan 15 23:45:24.113 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Jan 15 23:45:26.223866 coreos-metadata[1584]: Jan 15 23:45:26.223 INFO Fetch successful Jan 15 23:45:26.224526 coreos-metadata[1584]: Jan 15 23:45:26.224 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 15 23:45:26.225866 coreos-metadata[1664]: Jan 15 23:45:26.225 INFO Fetch successful Jan 15 23:45:26.225866 coreos-metadata[1664]: Jan 15 23:45:26.225 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Jan 15 23:45:29.635872 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jan 15 23:45:29.637344 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 23:45:29.770314 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 23:45:29.774230 (kubelet)[1782]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 23:45:29.810885 kubelet[1782]: E0115 23:45:29.810822 1782 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 23:45:29.814022 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 23:45:29.814147 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 23:45:29.816540 systemd[1]: kubelet.service: Consumed 143ms CPU time, 106.4M memory peak. Jan 15 23:45:29.821817 coreos-metadata[1664]: Jan 15 23:45:29.821 INFO Fetch successful Jan 15 23:45:29.824021 unknown[1664]: wrote ssh authorized keys file for user: core Jan 15 23:45:29.846303 update-ssh-keys[1792]: Updated "/home/core/.ssh/authorized_keys" Jan 15 23:45:29.847285 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Jan 15 23:45:29.849297 systemd[1]: Finished sshkeys.service. Jan 15 23:45:30.431823 coreos-metadata[1584]: Jan 15 23:45:30.431 INFO Fetch successful Jan 15 23:45:30.431823 coreos-metadata[1584]: Jan 15 23:45:30.431 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Jan 15 23:45:30.949030 systemd[1]: Started sshd@3-10.0.10.219:22-68.220.241.50:60270.service - OpenSSH per-connection server daemon (68.220.241.50:60270). Jan 15 23:45:31.556738 sshd[1796]: Accepted publickey for core from 68.220.241.50 port 60270 ssh2: RSA SHA256:mlFwV2mNhWAX9miLlDaAVOccp1AYIG1i1y9cTR0vub4 Jan 15 23:45:31.558117 sshd-session[1796]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 23:45:31.562485 systemd-logind[1599]: New session 4 of user core. 
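With no config-drive attached (hence the repeated "Can't lookup blockdev" probes against /dev/disk/by-label/config-2), both metadata agents fall back to the link-local HTTP metadata service, which is exactly what the Fetching/Fetch successful lines trace. The same endpoints can be queried by hand; a stdlib-only sketch with paths copied from the log:

    from urllib.request import urlopen

    BASE = "http://169.254.169.254"

    def fetch(path: str) -> str:
        """Fetch one metadata endpoint, as the agent's Fetching lines do."""
        with urlopen(BASE + path, timeout=5) as resp:
            return resp.read().decode()

    hostname = fetch("/latest/meta-data/hostname")
    ssh_key = fetch("/latest/meta-data/public-keys/0/openssh-key")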
Jan 15 23:45:31.573640 systemd[1]: Started session-4.scope - Session 4 of User core. Jan 15 23:45:31.663400 coreos-metadata[1584]: Jan 15 23:45:31.663 INFO Fetch successful Jan 15 23:45:31.663400 coreos-metadata[1584]: Jan 15 23:45:31.663 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Jan 15 23:45:31.984625 sshd[1799]: Connection closed by 68.220.241.50 port 60270 Jan 15 23:45:31.985063 sshd-session[1796]: pam_unix(sshd:session): session closed for user core Jan 15 23:45:31.988498 systemd[1]: sshd@3-10.0.10.219:22-68.220.241.50:60270.service: Deactivated successfully. Jan 15 23:45:31.990001 systemd[1]: session-4.scope: Deactivated successfully. Jan 15 23:45:31.992620 systemd-logind[1599]: Session 4 logged out. Waiting for processes to exit. Jan 15 23:45:31.994126 systemd-logind[1599]: Removed session 4. Jan 15 23:45:32.097917 systemd[1]: Started sshd@4-10.0.10.219:22-68.220.241.50:60276.service - OpenSSH per-connection server daemon (68.220.241.50:60276). Jan 15 23:45:32.723293 sshd[1805]: Accepted publickey for core from 68.220.241.50 port 60276 ssh2: RSA SHA256:mlFwV2mNhWAX9miLlDaAVOccp1AYIG1i1y9cTR0vub4 Jan 15 23:45:32.724555 sshd-session[1805]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 23:45:32.728493 systemd-logind[1599]: New session 5 of user core. Jan 15 23:45:32.740821 systemd[1]: Started session-5.scope - Session 5 of User core. Jan 15 23:45:32.903507 coreos-metadata[1584]: Jan 15 23:45:32.903 INFO Fetch successful Jan 15 23:45:32.903507 coreos-metadata[1584]: Jan 15 23:45:32.903 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Jan 15 23:45:33.163608 sshd[1808]: Connection closed by 68.220.241.50 port 60276 Jan 15 23:45:33.163942 sshd-session[1805]: pam_unix(sshd:session): session closed for user core Jan 15 23:45:33.167818 systemd[1]: sshd@4-10.0.10.219:22-68.220.241.50:60276.service: Deactivated successfully. Jan 15 23:45:33.169690 systemd[1]: session-5.scope: Deactivated successfully. Jan 15 23:45:33.170616 systemd-logind[1599]: Session 5 logged out. Waiting for processes to exit. Jan 15 23:45:33.171633 systemd-logind[1599]: Removed session 5. Jan 15 23:45:34.164256 coreos-metadata[1584]: Jan 15 23:45:34.164 INFO Fetch successful Jan 15 23:45:34.164256 coreos-metadata[1584]: Jan 15 23:45:34.164 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Jan 15 23:45:38.804179 coreos-metadata[1584]: Jan 15 23:45:38.804 INFO Fetch successful Jan 15 23:45:38.836065 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Jan 15 23:45:38.836511 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jan 15 23:45:38.836630 systemd[1]: Reached target multi-user.target - Multi-User System. Jan 15 23:45:38.840563 systemd[1]: Startup finished in 3.111s (kernel) + 13.181s (initrd) + 24.107s (userspace) = 40.400s. Jan 15 23:45:39.962803 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jan 15 23:45:39.964251 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 23:45:40.106369 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
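A small check on the startup summary: the printed parts sum to 40.399 s against the 40.400 s total, a harmless rounding artifact, since systemd tracks each span in microseconds and rounds the components independently of the total:

    # Parts of "Startup finished" as printed, versus the rounded total.
    kernel, initrd, userspace = 3.111, 13.181, 24.107
    print(f"{kernel + initrd + userspace:.3f}")  # 40.399; the log says 40.400s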
Jan 15 23:45:40.110558 (kubelet)[1826]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 23:45:40.144048 kubelet[1826]: E0115 23:45:40.144002 1826 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 23:45:40.146711 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 23:45:40.146852 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 23:45:40.147363 systemd[1]: kubelet.service: Consumed 136ms CPU time, 107.5M memory peak. Jan 15 23:45:40.581262 chronyd[1582]: Selected source PHC0 Jan 15 23:45:43.051619 systemd[1]: Started sshd@5-10.0.10.219:22-68.220.241.50:35796.service - OpenSSH per-connection server daemon (68.220.241.50:35796). Jan 15 23:45:43.633221 sshd[1835]: Accepted publickey for core from 68.220.241.50 port 35796 ssh2: RSA SHA256:mlFwV2mNhWAX9miLlDaAVOccp1AYIG1i1y9cTR0vub4 Jan 15 23:45:43.634367 sshd-session[1835]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 23:45:43.638297 systemd-logind[1599]: New session 6 of user core. Jan 15 23:45:43.644560 systemd[1]: Started session-6.scope - Session 6 of User core. Jan 15 23:45:44.037315 sshd[1838]: Connection closed by 68.220.241.50 port 35796 Jan 15 23:45:44.036370 sshd-session[1835]: pam_unix(sshd:session): session closed for user core Jan 15 23:45:44.039106 systemd[1]: sshd@5-10.0.10.219:22-68.220.241.50:35796.service: Deactivated successfully. Jan 15 23:45:44.040848 systemd[1]: session-6.scope: Deactivated successfully. Jan 15 23:45:44.041992 systemd-logind[1599]: Session 6 logged out. Waiting for processes to exit. Jan 15 23:45:44.044663 systemd-logind[1599]: Removed session 6. Jan 15 23:45:44.133622 systemd[1]: Started sshd@6-10.0.10.219:22-68.220.241.50:35812.service - OpenSSH per-connection server daemon (68.220.241.50:35812). Jan 15 23:45:44.715046 sshd[1844]: Accepted publickey for core from 68.220.241.50 port 35812 ssh2: RSA SHA256:mlFwV2mNhWAX9miLlDaAVOccp1AYIG1i1y9cTR0vub4 Jan 15 23:45:44.716184 sshd-session[1844]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 23:45:44.719538 systemd-logind[1599]: New session 7 of user core. Jan 15 23:45:44.730726 systemd[1]: Started session-7.scope - Session 7 of User core. Jan 15 23:45:45.105095 sshd[1847]: Connection closed by 68.220.241.50 port 35812 Jan 15 23:45:45.105519 sshd-session[1844]: pam_unix(sshd:session): session closed for user core Jan 15 23:45:45.108647 systemd[1]: sshd@6-10.0.10.219:22-68.220.241.50:35812.service: Deactivated successfully. Jan 15 23:45:45.110124 systemd[1]: session-7.scope: Deactivated successfully. Jan 15 23:45:45.112453 systemd-logind[1599]: Session 7 logged out. Waiting for processes to exit. Jan 15 23:45:45.113755 systemd-logind[1599]: Removed session 7. Jan 15 23:45:45.211110 systemd[1]: Started sshd@7-10.0.10.219:22-68.220.241.50:35822.service - OpenSSH per-connection server daemon (68.220.241.50:35822). 
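The kubelet restart attempts land at 23:45:29.6, 23:45:39.9 and 23:45:50.2, roughly ten seconds apart, which would be consistent with a Restart=always unit using RestartSec=10 (the stock kubelet unit ships with that setting, though this log does not show the unit file). Checking the spacing:

    from datetime import datetime

    restarts = ["23:45:29.635872", "23:45:39.962803", "23:45:50.212318"]
    times = [datetime.strptime(t, "%H:%M:%S.%f") for t in restarts]
    gaps = [(b - a).total_seconds() for a, b in zip(times, times[1:])]
    print(gaps)  # ~[10.33, 10.25]: restart delay plus a little startup overhead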
Jan 15 23:45:45.800994 sshd[1853]: Accepted publickey for core from 68.220.241.50 port 35822 ssh2: RSA SHA256:mlFwV2mNhWAX9miLlDaAVOccp1AYIG1i1y9cTR0vub4 Jan 15 23:45:45.802237 sshd-session[1853]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 23:45:45.806560 systemd-logind[1599]: New session 8 of user core. Jan 15 23:45:45.819631 systemd[1]: Started session-8.scope - Session 8 of User core. Jan 15 23:45:46.241613 sshd[1856]: Connection closed by 68.220.241.50 port 35822 Jan 15 23:45:46.242141 sshd-session[1853]: pam_unix(sshd:session): session closed for user core Jan 15 23:45:46.245587 systemd[1]: sshd@7-10.0.10.219:22-68.220.241.50:35822.service: Deactivated successfully. Jan 15 23:45:46.248889 systemd[1]: session-8.scope: Deactivated successfully. Jan 15 23:45:46.249645 systemd-logind[1599]: Session 8 logged out. Waiting for processes to exit. Jan 15 23:45:46.250817 systemd-logind[1599]: Removed session 8. Jan 15 23:45:46.351766 systemd[1]: Started sshd@8-10.0.10.219:22-68.220.241.50:35824.service - OpenSSH per-connection server daemon (68.220.241.50:35824). Jan 15 23:45:46.991358 sshd[1862]: Accepted publickey for core from 68.220.241.50 port 35824 ssh2: RSA SHA256:mlFwV2mNhWAX9miLlDaAVOccp1AYIG1i1y9cTR0vub4 Jan 15 23:45:46.992888 sshd-session[1862]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 23:45:46.996995 systemd-logind[1599]: New session 9 of user core. Jan 15 23:45:47.003602 systemd[1]: Started session-9.scope - Session 9 of User core. Jan 15 23:45:47.350217 sudo[1866]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jan 15 23:45:47.350488 sudo[1866]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 15 23:45:47.362363 sudo[1866]: pam_unix(sudo:session): session closed for user root Jan 15 23:45:47.462046 sshd[1865]: Connection closed by 68.220.241.50 port 35824 Jan 15 23:45:47.462575 sshd-session[1862]: pam_unix(sshd:session): session closed for user core Jan 15 23:45:47.466317 systemd[1]: sshd@8-10.0.10.219:22-68.220.241.50:35824.service: Deactivated successfully. Jan 15 23:45:47.467945 systemd[1]: session-9.scope: Deactivated successfully. Jan 15 23:45:47.470196 systemd-logind[1599]: Session 9 logged out. Waiting for processes to exit. Jan 15 23:45:47.471728 systemd-logind[1599]: Removed session 9. Jan 15 23:45:47.580238 systemd[1]: Started sshd@9-10.0.10.219:22-68.220.241.50:35832.service - OpenSSH per-connection server daemon (68.220.241.50:35832). Jan 15 23:45:48.212151 sshd[1872]: Accepted publickey for core from 68.220.241.50 port 35832 ssh2: RSA SHA256:mlFwV2mNhWAX9miLlDaAVOccp1AYIG1i1y9cTR0vub4 Jan 15 23:45:48.213597 sshd-session[1872]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 23:45:48.218186 systemd-logind[1599]: New session 10 of user core. Jan 15 23:45:48.228644 systemd[1]: Started session-10.scope - Session 10 of User core. 
Jan 15 23:45:48.544884 sudo[1877]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jan 15 23:45:48.545176 sudo[1877]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 15 23:45:48.549875 sudo[1877]: pam_unix(sudo:session): session closed for user root Jan 15 23:45:48.554887 sudo[1876]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jan 15 23:45:48.555153 sudo[1876]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 15 23:45:48.563610 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 15 23:45:48.597642 augenrules[1899]: No rules Jan 15 23:45:48.598813 systemd[1]: audit-rules.service: Deactivated successfully. Jan 15 23:45:48.600524 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 15 23:45:48.601839 sudo[1876]: pam_unix(sudo:session): session closed for user root Jan 15 23:45:48.698249 sshd[1875]: Connection closed by 68.220.241.50 port 35832 Jan 15 23:45:48.698747 sshd-session[1872]: pam_unix(sshd:session): session closed for user core Jan 15 23:45:48.702228 systemd-logind[1599]: Session 10 logged out. Waiting for processes to exit. Jan 15 23:45:48.702491 systemd[1]: sshd@9-10.0.10.219:22-68.220.241.50:35832.service: Deactivated successfully. Jan 15 23:45:48.704051 systemd[1]: session-10.scope: Deactivated successfully. Jan 15 23:45:48.708054 systemd-logind[1599]: Removed session 10. Jan 15 23:45:48.808826 systemd[1]: Started sshd@10-10.0.10.219:22-68.220.241.50:35840.service - OpenSSH per-connection server daemon (68.220.241.50:35840). Jan 15 23:45:49.438164 sshd[1908]: Accepted publickey for core from 68.220.241.50 port 35840 ssh2: RSA SHA256:mlFwV2mNhWAX9miLlDaAVOccp1AYIG1i1y9cTR0vub4 Jan 15 23:45:49.439453 sshd-session[1908]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 23:45:49.443901 systemd-logind[1599]: New session 11 of user core. Jan 15 23:45:49.455701 systemd[1]: Started session-11.scope - Session 11 of User core. Jan 15 23:45:49.770729 sudo[1912]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jan 15 23:45:49.770987 sudo[1912]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 15 23:45:50.099637 systemd[1]: Starting docker.service - Docker Application Container Engine... Jan 15 23:45:50.118802 (dockerd)[1934]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jan 15 23:45:50.212318 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Jan 15 23:45:50.214110 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 23:45:50.634902 dockerd[1934]: time="2026-01-15T23:45:50.634838428Z" level=info msg="Starting up" Jan 15 23:45:50.635755 dockerd[1934]: time="2026-01-15T23:45:50.635694032Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Jan 15 23:45:50.646270 dockerd[1934]: time="2026-01-15T23:45:50.646216803Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Jan 15 23:45:51.016111 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 15 23:45:51.029883 (kubelet)[1965]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 23:45:51.063036 kubelet[1965]: E0115 23:45:51.062971 1965 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 23:45:51.065683 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 23:45:51.065814 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 23:45:51.067677 systemd[1]: kubelet.service: Consumed 139ms CPU time, 107.4M memory peak. Jan 15 23:45:51.133961 dockerd[1934]: time="2026-01-15T23:45:51.133780838Z" level=info msg="Loading containers: start." Jan 15 23:45:51.143667 kernel: Initializing XFRM netlink socket Jan 15 23:45:51.378621 systemd-networkd[1522]: docker0: Link UP Jan 15 23:45:51.384321 dockerd[1934]: time="2026-01-15T23:45:51.384268248Z" level=info msg="Loading containers: done." Jan 15 23:45:51.399226 dockerd[1934]: time="2026-01-15T23:45:51.398883639Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jan 15 23:45:51.399226 dockerd[1934]: time="2026-01-15T23:45:51.398982639Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Jan 15 23:45:51.399226 dockerd[1934]: time="2026-01-15T23:45:51.399074320Z" level=info msg="Initializing buildkit" Jan 15 23:45:51.424207 dockerd[1934]: time="2026-01-15T23:45:51.424170521Z" level=info msg="Completed buildkit initialization" Jan 15 23:45:51.431411 dockerd[1934]: time="2026-01-15T23:45:51.431365236Z" level=info msg="Daemon has completed initialization" Jan 15 23:45:51.431634 systemd[1]: Started docker.service - Docker Application Container Engine. Jan 15 23:45:51.431884 dockerd[1934]: time="2026-01-15T23:45:51.431598877Z" level=info msg="API listen on /run/docker.sock" Jan 15 23:45:52.960846 containerd[1623]: time="2026-01-15T23:45:52.960806144Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.7\"" Jan 15 23:45:53.563786 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount548209757.mount: Deactivated successfully. 
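Once dockerd logs "API listen on /run/docker.sock" the daemon is reachable over its unix socket, and the API's /_ping endpoint is the simplest liveness probe. A stdlib-only sketch, using HTTP/1.0 so the response framing stays trivial:

    import socket

    s = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    s.connect("/run/docker.sock")
    # /_ping is Docker's liveness endpoint; a healthy daemon answers 200 "OK".
    s.sendall(b"GET /_ping HTTP/1.0\r\nHost: docker\r\n\r\n")
    print(s.recv(4096).decode())
    s.close()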
Jan 15 23:45:54.550534 containerd[1623]: time="2026-01-15T23:45:54.550476304Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 23:45:54.553444 containerd[1623]: time="2026-01-15T23:45:54.553387038Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.7: active requests=0, bytes read=27387379" Jan 15 23:45:54.554792 containerd[1623]: time="2026-01-15T23:45:54.554763725Z" level=info msg="ImageCreate event name:\"sha256:6d7bc8e445519fe4d49eee834f33f3e165eef4d3c0919ba08c67cdf8db905b7e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 23:45:54.558032 containerd[1623]: time="2026-01-15T23:45:54.557985260Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:9585226cb85d1dc0f0ef5f7a75f04e4bc91ddd82de249533bd293aa3cf958dab\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 23:45:54.559482 containerd[1623]: time="2026-01-15T23:45:54.559429747Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.7\" with image id \"sha256:6d7bc8e445519fe4d49eee834f33f3e165eef4d3c0919ba08c67cdf8db905b7e\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.7\", repo digest \"registry.k8s.io/kube-apiserver@sha256:9585226cb85d1dc0f0ef5f7a75f04e4bc91ddd82de249533bd293aa3cf958dab\", size \"27383880\" in 1.598582003s" Jan 15 23:45:54.559525 containerd[1623]: time="2026-01-15T23:45:54.559482788Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.7\" returns image reference \"sha256:6d7bc8e445519fe4d49eee834f33f3e165eef4d3c0919ba08c67cdf8db905b7e\"" Jan 15 23:45:54.561243 containerd[1623]: time="2026-01-15T23:45:54.561211476Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.7\"" Jan 15 23:45:55.692361 containerd[1623]: time="2026-01-15T23:45:55.691415976Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 23:45:55.692361 containerd[1623]: time="2026-01-15T23:45:55.692323740Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.7: active requests=0, bytes read=23553101" Jan 15 23:45:55.693316 containerd[1623]: time="2026-01-15T23:45:55.693285425Z" level=info msg="ImageCreate event name:\"sha256:a94595d0240bcc5e538b4b33bbc890512a731425be69643cbee284072f7d8f64\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 23:45:55.697059 containerd[1623]: time="2026-01-15T23:45:55.697006363Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:f69d77ca0626b5a4b7b432c18de0952941181db7341c80eb89731f46d1d0c230\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 23:45:55.698076 containerd[1623]: time="2026-01-15T23:45:55.698016968Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.7\" with image id \"sha256:a94595d0240bcc5e538b4b33bbc890512a731425be69643cbee284072f7d8f64\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.7\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:f69d77ca0626b5a4b7b432c18de0952941181db7341c80eb89731f46d1d0c230\", size \"25137562\" in 1.136774732s" Jan 15 23:45:55.698076 containerd[1623]: time="2026-01-15T23:45:55.698058128Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.7\" returns image reference \"sha256:a94595d0240bcc5e538b4b33bbc890512a731425be69643cbee284072f7d8f64\"" Jan 15 23:45:55.698984 
containerd[1623]: time="2026-01-15T23:45:55.698762691Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.7\"" Jan 15 23:45:56.842707 containerd[1623]: time="2026-01-15T23:45:56.842661336Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 23:45:56.843827 containerd[1623]: time="2026-01-15T23:45:56.843798182Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.7: active requests=0, bytes read=18298087" Jan 15 23:45:56.845141 containerd[1623]: time="2026-01-15T23:45:56.845093388Z" level=info msg="ImageCreate event name:\"sha256:94005b6be50f054c8a4ef3f0d6976644e8b3c6a8bf15a9e8a2eeac3e8331b010\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 23:45:56.848033 containerd[1623]: time="2026-01-15T23:45:56.847984002Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:21bda321d8b4d48eb059fbc1593203d55d8b3bc7acd0584e04e55504796d78d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 23:45:56.849951 containerd[1623]: time="2026-01-15T23:45:56.849871011Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.7\" with image id \"sha256:94005b6be50f054c8a4ef3f0d6976644e8b3c6a8bf15a9e8a2eeac3e8331b010\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.7\", repo digest \"registry.k8s.io/kube-scheduler@sha256:21bda321d8b4d48eb059fbc1593203d55d8b3bc7acd0584e04e55504796d78d0\", size \"19882566\" in 1.15107244s" Jan 15 23:45:56.849951 containerd[1623]: time="2026-01-15T23:45:56.849913491Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.7\" returns image reference \"sha256:94005b6be50f054c8a4ef3f0d6976644e8b3c6a8bf15a9e8a2eeac3e8331b010\"" Jan 15 23:45:56.850452 containerd[1623]: time="2026-01-15T23:45:56.850418854Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.7\"" Jan 15 23:45:57.808175 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2114262982.mount: Deactivated successfully. 
Jan 15 23:45:58.065709 containerd[1623]: time="2026-01-15T23:45:58.065592757Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 23:45:58.067247 containerd[1623]: time="2026-01-15T23:45:58.067213405Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.7: active requests=0, bytes read=28258699" Jan 15 23:45:58.068148 containerd[1623]: time="2026-01-15T23:45:58.068126090Z" level=info msg="ImageCreate event name:\"sha256:78ccb937011a53894db229033fd54e237d478ec85315f8b08e5dcaa0f737111b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 23:45:58.070749 containerd[1623]: time="2026-01-15T23:45:58.070678942Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:ec25702b19026e9c0d339bc1c3bd231435a59f28b5fccb21e1b1078a357380f5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 23:45:58.071338 containerd[1623]: time="2026-01-15T23:45:58.071255785Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.7\" with image id \"sha256:78ccb937011a53894db229033fd54e237d478ec85315f8b08e5dcaa0f737111b\", repo tag \"registry.k8s.io/kube-proxy:v1.33.7\", repo digest \"registry.k8s.io/kube-proxy@sha256:ec25702b19026e9c0d339bc1c3bd231435a59f28b5fccb21e1b1078a357380f5\", size \"28257692\" in 1.220792771s" Jan 15 23:45:58.071338 containerd[1623]: time="2026-01-15T23:45:58.071297025Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.7\" returns image reference \"sha256:78ccb937011a53894db229033fd54e237d478ec85315f8b08e5dcaa0f737111b\"" Jan 15 23:45:58.071778 containerd[1623]: time="2026-01-15T23:45:58.071742707Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" Jan 15 23:45:58.650111 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2732432423.mount: Deactivated successfully. 
Jan 15 23:45:59.458242 containerd[1623]: time="2026-01-15T23:45:59.457824116Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 23:45:59.458988 containerd[1623]: time="2026-01-15T23:45:59.458814760Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=19152209" Jan 15 23:45:59.459783 containerd[1623]: time="2026-01-15T23:45:59.459759525Z" level=info msg="ImageCreate event name:\"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 23:45:59.463051 containerd[1623]: time="2026-01-15T23:45:59.463010661Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 23:45:59.464194 containerd[1623]: time="2026-01-15T23:45:59.464168826Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"19148915\" in 1.392312598s" Jan 15 23:45:59.464258 containerd[1623]: time="2026-01-15T23:45:59.464195666Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\"" Jan 15 23:45:59.464718 containerd[1623]: time="2026-01-15T23:45:59.464695789Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Jan 15 23:45:59.955821 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3247268520.mount: Deactivated successfully. 
Jan 15 23:45:59.969465 containerd[1623]: time="2026-01-15T23:45:59.968474540Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268723" Jan 15 23:45:59.969465 containerd[1623]: time="2026-01-15T23:45:59.968540100Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 15 23:45:59.970550 containerd[1623]: time="2026-01-15T23:45:59.970502390Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 15 23:45:59.971348 containerd[1623]: time="2026-01-15T23:45:59.971318794Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 15 23:45:59.972087 containerd[1623]: time="2026-01-15T23:45:59.972050757Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 506.980366ms" Jan 15 23:45:59.972087 containerd[1623]: time="2026-01-15T23:45:59.972083797Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" Jan 15 23:45:59.972533 containerd[1623]: time="2026-01-15T23:45:59.972486879Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\"" Jan 15 23:46:00.521720 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3078351490.mount: Deactivated successfully. Jan 15 23:46:01.212691 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Jan 15 23:46:01.215048 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 23:46:01.326565 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 23:46:01.330654 (kubelet)[2354]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 23:46:01.428333 kubelet[2354]: E0115 23:46:01.428272 2354 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 23:46:01.430755 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 23:46:01.430982 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 23:46:01.431613 systemd[1]: kubelet.service: Consumed 138ms CPU time, 105.4M memory peak. Jan 15 23:46:02.449002 update_engine[1606]: I20260115 23:46:02.448915 1606 update_attempter.cc:509] Updating boot flags... 
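For scale, the pull messages above carry enough data to compute effective registry throughput; the sizes (bytes) and wall-clock durations below are copied verbatim from the containerd "Pulled image" entries, and the etcd pull that completes further down follows the same pattern:

    # Effective pull throughput for the images logged above (verbatim sizes
    # and durations from the containerd "Pulled image" entries).
    pulls = {
        "kube-apiserver:v1.33.7":          (27383880, 1.598582003),
        "kube-controller-manager:v1.33.7": (25137562, 1.136774732),
        "kube-scheduler:v1.33.7":          (19882566, 1.15107244),
        "kube-proxy:v1.33.7":              (28257692, 1.220792771),
        "coredns:v1.12.0":                 (19148915, 1.392312598),
        "pause:3.10":                      (267933,   0.506980366),
    }
    for image, (size, secs) in pulls.items():
        print(f"{size / secs / 1e6:5.1f} MB/s  {image}")

The pause image is too small for its transfer time to say much about bandwidth; the larger images settle in the roughly 14-23 MB/s range.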
Jan 15 23:46:02.839845 containerd[1623]: time="2026-01-15T23:46:02.839770966Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 23:46:02.840818 containerd[1623]: time="2026-01-15T23:46:02.840773250Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=70013713" Jan 15 23:46:02.842329 containerd[1623]: time="2026-01-15T23:46:02.842278578Z" level=info msg="ImageCreate event name:\"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 23:46:02.847260 containerd[1623]: time="2026-01-15T23:46:02.846491438Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 23:46:02.847260 containerd[1623]: time="2026-01-15T23:46:02.847138441Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"70026017\" in 2.874618762s" Jan 15 23:46:02.847260 containerd[1623]: time="2026-01-15T23:46:02.847173521Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\"" Jan 15 23:46:08.553560 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 23:46:08.553705 systemd[1]: kubelet.service: Consumed 138ms CPU time, 105.4M memory peak. Jan 15 23:46:08.555635 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 23:46:08.580515 systemd[1]: Reload requested from client PID 2418 ('systemctl') (unit session-11.scope)... Jan 15 23:46:08.580534 systemd[1]: Reloading... Jan 15 23:46:08.660495 zram_generator::config[2464]: No configuration found. Jan 15 23:46:08.823943 systemd[1]: Reloading finished in 243 ms. Jan 15 23:46:08.887201 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jan 15 23:46:08.887289 systemd[1]: kubelet.service: Failed with result 'signal'. Jan 15 23:46:08.887653 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 23:46:08.887708 systemd[1]: kubelet.service: Consumed 92ms CPU time, 94.9M memory peak. Jan 15 23:46:08.889376 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 23:46:09.145481 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 23:46:09.149701 (kubelet)[2509]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 15 23:46:09.872573 kubelet[2509]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 15 23:46:09.872573 kubelet[2509]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 15 23:46:09.872573 kubelet[2509]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 15 23:46:09.872926 kubelet[2509]: I0115 23:46:09.872615 2509 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 15 23:46:10.780623 kubelet[2509]: I0115 23:46:10.780568 2509 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Jan 15 23:46:10.780623 kubelet[2509]: I0115 23:46:10.780608 2509 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 15 23:46:10.780888 kubelet[2509]: I0115 23:46:10.780869 2509 server.go:956] "Client rotation is on, will bootstrap in background" Jan 15 23:46:10.817381 kubelet[2509]: E0115 23:46:10.817323 2509 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.0.10.219:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.10.219:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Jan 15 23:46:10.821906 kubelet[2509]: I0115 23:46:10.821857 2509 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 15 23:46:10.831472 kubelet[2509]: I0115 23:46:10.831448 2509 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 15 23:46:10.834495 kubelet[2509]: I0115 23:46:10.834461 2509 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Jan 15 23:46:10.834829 kubelet[2509]: I0115 23:46:10.834799 2509 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 15 23:46:10.834976 kubelet[2509]: I0115 23:46:10.834829 2509 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-2-n-b7ec270451","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 15 23:46:10.835074 kubelet[2509]: I0115 23:46:10.835062 2509 
topology_manager.go:138] "Creating topology manager with none policy" Jan 15 23:46:10.835074 kubelet[2509]: I0115 23:46:10.835073 2509 container_manager_linux.go:303] "Creating device plugin manager" Jan 15 23:46:10.836158 kubelet[2509]: I0115 23:46:10.836120 2509 state_mem.go:36] "Initialized new in-memory state store" Jan 15 23:46:10.840263 kubelet[2509]: I0115 23:46:10.840224 2509 kubelet.go:480] "Attempting to sync node with API server" Jan 15 23:46:10.840368 kubelet[2509]: I0115 23:46:10.840256 2509 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 15 23:46:10.840412 kubelet[2509]: I0115 23:46:10.840388 2509 kubelet.go:386] "Adding apiserver pod source" Jan 15 23:46:10.843046 kubelet[2509]: I0115 23:46:10.842748 2509 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 15 23:46:10.844368 kubelet[2509]: I0115 23:46:10.844339 2509 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Jan 15 23:46:10.845112 kubelet[2509]: I0115 23:46:10.845077 2509 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Jan 15 23:46:10.845248 kubelet[2509]: W0115 23:46:10.845222 2509 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Jan 15 23:46:10.847580 kubelet[2509]: E0115 23:46:10.847018 2509 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.10.219:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.10.219:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Jan 15 23:46:10.847741 kubelet[2509]: I0115 23:46:10.847685 2509 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 15 23:46:10.847741 kubelet[2509]: I0115 23:46:10.847731 2509 server.go:1289] "Started kubelet" Jan 15 23:46:10.849153 kubelet[2509]: I0115 23:46:10.849124 2509 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 15 23:46:10.849301 kubelet[2509]: E0115 23:46:10.849272 2509 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.10.219:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-2-2-n-b7ec270451&limit=500&resourceVersion=0\": dial tcp 10.0.10.219:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Jan 15 23:46:10.854269 kubelet[2509]: I0115 23:46:10.854218 2509 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Jan 15 23:46:10.857457 kubelet[2509]: I0115 23:46:10.856654 2509 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 15 23:46:10.861629 kubelet[2509]: I0115 23:46:10.860566 2509 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 15 23:46:10.861629 kubelet[2509]: E0115 23:46:10.861031 2509 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4459-2-2-n-b7ec270451\" not found" Jan 15 23:46:10.861776 kubelet[2509]: I0115 23:46:10.861755 2509 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jan 15 23:46:10.861924 kubelet[2509]: I0115 23:46:10.861900 2509 reconciler.go:26] "Reconciler: start to sync state" Jan 15 23:46:10.862917 kubelet[2509]: 
E0115 23:46:10.862622 2509 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.10.219:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.10.219:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Jan 15 23:46:10.862917 kubelet[2509]: E0115 23:46:10.862735 2509 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.10.219:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-n-b7ec270451?timeout=10s\": dial tcp 10.0.10.219:6443: connect: connection refused" interval="200ms" Jan 15 23:46:10.865773 kubelet[2509]: I0115 23:46:10.864921 2509 factory.go:223] Registration of the systemd container factory successfully Jan 15 23:46:10.865773 kubelet[2509]: I0115 23:46:10.865065 2509 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 15 23:46:10.866043 kubelet[2509]: I0115 23:46:10.865987 2509 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 15 23:46:10.866720 kubelet[2509]: E0115 23:46:10.862801 2509 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.10.219:6443/api/v1/namespaces/default/events\": dial tcp 10.0.10.219:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459-2-2-n-b7ec270451.188b0c456446fc24 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459-2-2-n-b7ec270451,UID:ci-4459-2-2-n-b7ec270451,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459-2-2-n-b7ec270451,},FirstTimestamp:2026-01-15 23:46:10.847702052 +0000 UTC m=+1.694686748,LastTimestamp:2026-01-15 23:46:10.847702052 +0000 UTC m=+1.694686748,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-2-n-b7ec270451,}" Jan 15 23:46:10.866720 kubelet[2509]: I0115 23:46:10.866683 2509 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 15 23:46:10.866921 kubelet[2509]: E0115 23:46:10.866896 2509 kubelet.go:1600] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 15 23:46:10.867001 kubelet[2509]: I0115 23:46:10.866973 2509 server.go:317] "Adding debug handlers to kubelet server" Jan 15 23:46:10.867761 kubelet[2509]: I0115 23:46:10.867741 2509 factory.go:223] Registration of the containerd container factory successfully Jan 15 23:46:10.879244 kubelet[2509]: I0115 23:46:10.878989 2509 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 15 23:46:10.879244 kubelet[2509]: I0115 23:46:10.879006 2509 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 15 23:46:10.879244 kubelet[2509]: I0115 23:46:10.879025 2509 state_mem.go:36] "Initialized new in-memory state store" Jan 15 23:46:10.881217 kubelet[2509]: I0115 23:46:10.881195 2509 policy_none.go:49] "None policy: Start" Jan 15 23:46:10.881322 kubelet[2509]: I0115 23:46:10.881311 2509 memory_manager.go:186] "Starting memorymanager" policy="None" Jan 15 23:46:10.881378 kubelet[2509]: I0115 23:46:10.881366 2509 state_mem.go:35] "Initializing new in-memory state store" Jan 15 23:46:10.888430 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jan 15 23:46:10.890418 kubelet[2509]: I0115 23:46:10.890369 2509 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Jan 15 23:46:10.891405 kubelet[2509]: I0115 23:46:10.891361 2509 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Jan 15 23:46:10.891405 kubelet[2509]: I0115 23:46:10.891383 2509 status_manager.go:230] "Starting to sync pod status with apiserver" Jan 15 23:46:10.891405 kubelet[2509]: I0115 23:46:10.891408 2509 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Jan 15 23:46:10.891405 kubelet[2509]: I0115 23:46:10.891414 2509 kubelet.go:2436] "Starting kubelet main sync loop" Jan 15 23:46:10.891567 kubelet[2509]: E0115 23:46:10.891532 2509 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 15 23:46:10.892163 kubelet[2509]: E0115 23:46:10.892070 2509 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.10.219:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.10.219:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Jan 15 23:46:10.898538 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jan 15 23:46:10.901468 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
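The kubepods slices just created map one-to-one onto Kubernetes pod QoS classes under the systemd cgroup driver (cgroup v2, per CgroupVersion:2 in the container-manager config above). A sketch of the conventional layout, for orientation only:

    # Conventional kubepods cgroup layout with the systemd driver on
    # cgroup v2 (illustrative; per-pod slice names are escaped by systemd).
    QOS_PARENT = {
        "Guaranteed": "kubepods.slice",
        "Burstable":  "kubepods.slice/kubepods-burstable.slice",
        "BestEffort": "kubepods.slice/kubepods-besteffort.slice",
    }
    for qos, path in QOS_PARENT.items():
        print(f"{qos:<10} pods -> /sys/fs/cgroup/{path}")

The per-pod kubepods-burstable-pod<uid>.slice units created just below are children of these parents.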
Jan 15 23:46:10.913421 kubelet[2509]: E0115 23:46:10.913371 2509 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Jan 15 23:46:10.913624 kubelet[2509]: I0115 23:46:10.913601 2509 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 15 23:46:10.913658 kubelet[2509]: I0115 23:46:10.913623 2509 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 15 23:46:10.914678 kubelet[2509]: I0115 23:46:10.914255 2509 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 15 23:46:10.915074 kubelet[2509]: E0115 23:46:10.914866 2509 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Jan 15 23:46:10.915248 kubelet[2509]: E0115 23:46:10.915175 2509 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4459-2-2-n-b7ec270451\" not found" Jan 15 23:46:11.001677 systemd[1]: Created slice kubepods-burstable-pod317e0c512642ef48debbbbeb1b8dc0bc.slice - libcontainer container kubepods-burstable-pod317e0c512642ef48debbbbeb1b8dc0bc.slice. Jan 15 23:46:11.015611 kubelet[2509]: I0115 23:46:11.015563 2509 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-n-b7ec270451" Jan 15 23:46:11.016139 kubelet[2509]: E0115 23:46:11.016092 2509 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.10.219:6443/api/v1/nodes\": dial tcp 10.0.10.219:6443: connect: connection refused" node="ci-4459-2-2-n-b7ec270451" Jan 15 23:46:11.028274 kubelet[2509]: E0115 23:46:11.028223 2509 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-n-b7ec270451\" not found" node="ci-4459-2-2-n-b7ec270451" Jan 15 23:46:11.031481 systemd[1]: Created slice kubepods-burstable-pod0d35633e582bc9b906247ef1d1e6b3cd.slice - libcontainer container kubepods-burstable-pod0d35633e582bc9b906247ef1d1e6b3cd.slice. Jan 15 23:46:11.051777 kubelet[2509]: E0115 23:46:11.051740 2509 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-n-b7ec270451\" not found" node="ci-4459-2-2-n-b7ec270451" Jan 15 23:46:11.054261 systemd[1]: Created slice kubepods-burstable-pod22205193bf6b9442abdf805753a824ce.slice - libcontainer container kubepods-burstable-pod22205193bf6b9442abdf805753a824ce.slice. 
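The "connection refused" errors against 10.0.10.219:6443 are the usual kubeadm chicken-and-egg: the kubelet is trying to register with an API server that it is itself about to start from the static pod path added earlier (/etc/kubernetes/manifests). Static pods are run directly from disk, with no API server involved; a sketch of what the kubelet is watching, assuming the standard kubeadm layout:

    # List the static pod manifests the kubelet runs directly from disk.
    # On a kubeadm control-plane node this directory typically holds
    # kube-apiserver.yaml, kube-controller-manager.yaml and
    # kube-scheduler.yaml (plus etcd.yaml on stacked-etcd setups).
    from pathlib import Path

    for manifest in sorted(Path("/etc/kubernetes/manifests").glob("*.yaml")):
        print(manifest.name)

Once the kube-apiserver static pod is up (the RunPodSandbox/StartContainer entries below), node registration and the lease calls start succeeding.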
Jan 15 23:46:11.056107 kubelet[2509]: E0115 23:46:11.056061 2509 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-n-b7ec270451\" not found" node="ci-4459-2-2-n-b7ec270451" Jan 15 23:46:11.063574 kubelet[2509]: E0115 23:46:11.063531 2509 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.10.219:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-n-b7ec270451?timeout=10s\": dial tcp 10.0.10.219:6443: connect: connection refused" interval="400ms" Jan 15 23:46:11.164087 kubelet[2509]: I0115 23:46:11.164004 2509 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/317e0c512642ef48debbbbeb1b8dc0bc-k8s-certs\") pod \"kube-apiserver-ci-4459-2-2-n-b7ec270451\" (UID: \"317e0c512642ef48debbbbeb1b8dc0bc\") " pod="kube-system/kube-apiserver-ci-4459-2-2-n-b7ec270451" Jan 15 23:46:11.164087 kubelet[2509]: I0115 23:46:11.164050 2509 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/317e0c512642ef48debbbbeb1b8dc0bc-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-2-n-b7ec270451\" (UID: \"317e0c512642ef48debbbbeb1b8dc0bc\") " pod="kube-system/kube-apiserver-ci-4459-2-2-n-b7ec270451" Jan 15 23:46:11.164087 kubelet[2509]: I0115 23:46:11.164071 2509 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/0d35633e582bc9b906247ef1d1e6b3cd-ca-certs\") pod \"kube-controller-manager-ci-4459-2-2-n-b7ec270451\" (UID: \"0d35633e582bc9b906247ef1d1e6b3cd\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-n-b7ec270451" Jan 15 23:46:11.164087 kubelet[2509]: I0115 23:46:11.164086 2509 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/0d35633e582bc9b906247ef1d1e6b3cd-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-2-n-b7ec270451\" (UID: \"0d35633e582bc9b906247ef1d1e6b3cd\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-n-b7ec270451" Jan 15 23:46:11.164087 kubelet[2509]: I0115 23:46:11.164101 2509 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/0d35633e582bc9b906247ef1d1e6b3cd-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-2-n-b7ec270451\" (UID: \"0d35633e582bc9b906247ef1d1e6b3cd\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-n-b7ec270451" Jan 15 23:46:11.164611 kubelet[2509]: I0115 23:46:11.164116 2509 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/22205193bf6b9442abdf805753a824ce-kubeconfig\") pod \"kube-scheduler-ci-4459-2-2-n-b7ec270451\" (UID: \"22205193bf6b9442abdf805753a824ce\") " pod="kube-system/kube-scheduler-ci-4459-2-2-n-b7ec270451" Jan 15 23:46:11.164611 kubelet[2509]: I0115 23:46:11.164130 2509 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/317e0c512642ef48debbbbeb1b8dc0bc-ca-certs\") pod \"kube-apiserver-ci-4459-2-2-n-b7ec270451\" (UID: \"317e0c512642ef48debbbbeb1b8dc0bc\") " 
pod="kube-system/kube-apiserver-ci-4459-2-2-n-b7ec270451" Jan 15 23:46:11.164611 kubelet[2509]: I0115 23:46:11.164143 2509 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/0d35633e582bc9b906247ef1d1e6b3cd-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-2-n-b7ec270451\" (UID: \"0d35633e582bc9b906247ef1d1e6b3cd\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-n-b7ec270451" Jan 15 23:46:11.164611 kubelet[2509]: I0115 23:46:11.164157 2509 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/0d35633e582bc9b906247ef1d1e6b3cd-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-2-n-b7ec270451\" (UID: \"0d35633e582bc9b906247ef1d1e6b3cd\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-n-b7ec270451" Jan 15 23:46:11.218728 kubelet[2509]: I0115 23:46:11.218692 2509 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-n-b7ec270451" Jan 15 23:46:11.219136 kubelet[2509]: E0115 23:46:11.219100 2509 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.10.219:6443/api/v1/nodes\": dial tcp 10.0.10.219:6443: connect: connection refused" node="ci-4459-2-2-n-b7ec270451" Jan 15 23:46:11.329823 containerd[1623]: time="2026-01-15T23:46:11.329711300Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-2-n-b7ec270451,Uid:317e0c512642ef48debbbbeb1b8dc0bc,Namespace:kube-system,Attempt:0,}" Jan 15 23:46:11.347922 containerd[1623]: time="2026-01-15T23:46:11.347873828Z" level=info msg="connecting to shim 31e95b3846a23a7416c0c26a536c14e33b595d5e5a5a988a7ecb70373d9a542a" address="unix:///run/containerd/s/a765791fb0d29ad28e65f28fcc3a703541f2310246ce164a75e3270c66e0b212" namespace=k8s.io protocol=ttrpc version=3 Jan 15 23:46:11.353469 containerd[1623]: time="2026-01-15T23:46:11.353409495Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-2-n-b7ec270451,Uid:0d35633e582bc9b906247ef1d1e6b3cd,Namespace:kube-system,Attempt:0,}" Jan 15 23:46:11.357266 containerd[1623]: time="2026-01-15T23:46:11.357227473Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-2-n-b7ec270451,Uid:22205193bf6b9442abdf805753a824ce,Namespace:kube-system,Attempt:0,}" Jan 15 23:46:11.371613 systemd[1]: Started cri-containerd-31e95b3846a23a7416c0c26a536c14e33b595d5e5a5a988a7ecb70373d9a542a.scope - libcontainer container 31e95b3846a23a7416c0c26a536c14e33b595d5e5a5a988a7ecb70373d9a542a. 
Jan 15 23:46:11.402387 containerd[1623]: time="2026-01-15T23:46:11.402337891Z" level=info msg="connecting to shim 9a6ff961469674d6fd2cafb8d257a5a48d446c81d56d13467f4c89150b9c2559" address="unix:///run/containerd/s/0daec9c26f6c15595453afc18e5b2e80a23c2a9d76994d50e02e87c3d6d5450c" namespace=k8s.io protocol=ttrpc version=3 Jan 15 23:46:11.416222 containerd[1623]: time="2026-01-15T23:46:11.416188718Z" level=info msg="connecting to shim f6bd08ce8c96c11de041647f0e6549be3ffb994cffd89616d40b6975c7904771" address="unix:///run/containerd/s/f525541d0296420e40efa1381707f676935a79cf34336b1391fce3123491a4e2" namespace=k8s.io protocol=ttrpc version=3 Jan 15 23:46:11.417451 containerd[1623]: time="2026-01-15T23:46:11.417404564Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-2-n-b7ec270451,Uid:317e0c512642ef48debbbbeb1b8dc0bc,Namespace:kube-system,Attempt:0,} returns sandbox id \"31e95b3846a23a7416c0c26a536c14e33b595d5e5a5a988a7ecb70373d9a542a\"" Jan 15 23:46:11.425506 containerd[1623]: time="2026-01-15T23:46:11.425403203Z" level=info msg="CreateContainer within sandbox \"31e95b3846a23a7416c0c26a536c14e33b595d5e5a5a988a7ecb70373d9a542a\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jan 15 23:46:11.427641 systemd[1]: Started cri-containerd-9a6ff961469674d6fd2cafb8d257a5a48d446c81d56d13467f4c89150b9c2559.scope - libcontainer container 9a6ff961469674d6fd2cafb8d257a5a48d446c81d56d13467f4c89150b9c2559. Jan 15 23:46:11.436415 systemd[1]: Started cri-containerd-f6bd08ce8c96c11de041647f0e6549be3ffb994cffd89616d40b6975c7904771.scope - libcontainer container f6bd08ce8c96c11de041647f0e6549be3ffb994cffd89616d40b6975c7904771. Jan 15 23:46:11.438359 containerd[1623]: time="2026-01-15T23:46:11.437826423Z" level=info msg="Container a0997c641d78ed54df55dfe4e9a226199205f99ad088c8f723e4f4261ebfe026: CDI devices from CRI Config.CDIDevices: []" Jan 15 23:46:11.447258 containerd[1623]: time="2026-01-15T23:46:11.446302783Z" level=info msg="CreateContainer within sandbox \"31e95b3846a23a7416c0c26a536c14e33b595d5e5a5a988a7ecb70373d9a542a\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"a0997c641d78ed54df55dfe4e9a226199205f99ad088c8f723e4f4261ebfe026\"" Jan 15 23:46:11.447409 containerd[1623]: time="2026-01-15T23:46:11.447335148Z" level=info msg="StartContainer for \"a0997c641d78ed54df55dfe4e9a226199205f99ad088c8f723e4f4261ebfe026\"" Jan 15 23:46:11.449490 containerd[1623]: time="2026-01-15T23:46:11.449458559Z" level=info msg="connecting to shim a0997c641d78ed54df55dfe4e9a226199205f99ad088c8f723e4f4261ebfe026" address="unix:///run/containerd/s/a765791fb0d29ad28e65f28fcc3a703541f2310246ce164a75e3270c66e0b212" protocol=ttrpc version=3 Jan 15 23:46:11.463992 kubelet[2509]: E0115 23:46:11.463943 2509 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.10.219:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-n-b7ec270451?timeout=10s\": dial tcp 10.0.10.219:6443: connect: connection refused" interval="800ms" Jan 15 23:46:11.469679 systemd[1]: Started cri-containerd-a0997c641d78ed54df55dfe4e9a226199205f99ad088c8f723e4f4261ebfe026.scope - libcontainer container a0997c641d78ed54df55dfe4e9a226199205f99ad088c8f723e4f4261ebfe026. 
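Note the lease-controller retry interval doubling across attempts: 200ms, then 400ms, now 800ms, while 10.0.10.219:6443 stays unreachable. That is a plain exponential backoff; a minimal sketch of the schedule, where the cap is an assumption for illustration rather than a value read out of this log:

    # Doubling backoff matching the logged retry intervals (0.2s base).
    # The 7s cap below is an assumption; the exact ceiling is defined in
    # the kubelet's node-lease controller, not in this log.
    def backoff(base=0.2, factor=2.0, cap=7.0):
        delay = base
        while True:
            yield min(delay, cap)
            delay *= factor

    g = backoff()
    print([round(next(g), 1) for _ in range(6)])  # [0.2, 0.4, 0.8, 1.6, 3.2, 6.4]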
Jan 15 23:46:11.481104 containerd[1623]: time="2026-01-15T23:46:11.480998871Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-2-n-b7ec270451,Uid:0d35633e582bc9b906247ef1d1e6b3cd,Namespace:kube-system,Attempt:0,} returns sandbox id \"9a6ff961469674d6fd2cafb8d257a5a48d446c81d56d13467f4c89150b9c2559\"" Jan 15 23:46:11.482914 containerd[1623]: time="2026-01-15T23:46:11.482875920Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-2-n-b7ec270451,Uid:22205193bf6b9442abdf805753a824ce,Namespace:kube-system,Attempt:0,} returns sandbox id \"f6bd08ce8c96c11de041647f0e6549be3ffb994cffd89616d40b6975c7904771\"" Jan 15 23:46:11.486914 containerd[1623]: time="2026-01-15T23:46:11.486857299Z" level=info msg="CreateContainer within sandbox \"9a6ff961469674d6fd2cafb8d257a5a48d446c81d56d13467f4c89150b9c2559\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jan 15 23:46:11.489428 containerd[1623]: time="2026-01-15T23:46:11.489388952Z" level=info msg="CreateContainer within sandbox \"f6bd08ce8c96c11de041647f0e6549be3ffb994cffd89616d40b6975c7904771\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jan 15 23:46:11.496025 containerd[1623]: time="2026-01-15T23:46:11.495973903Z" level=info msg="Container 8203c8f4ef4ef3992d2ff08c8245827de921d85365ba7aa78ec2a6091f205ad2: CDI devices from CRI Config.CDIDevices: []" Jan 15 23:46:11.506928 containerd[1623]: time="2026-01-15T23:46:11.506885396Z" level=info msg="Container 0f9d56bac994efa6b74213874400b108aefd8683233dad233862d9b06658faca: CDI devices from CRI Config.CDIDevices: []" Jan 15 23:46:11.513145 containerd[1623]: time="2026-01-15T23:46:11.513100226Z" level=info msg="CreateContainer within sandbox \"9a6ff961469674d6fd2cafb8d257a5a48d446c81d56d13467f4c89150b9c2559\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"8203c8f4ef4ef3992d2ff08c8245827de921d85365ba7aa78ec2a6091f205ad2\"" Jan 15 23:46:11.513794 containerd[1623]: time="2026-01-15T23:46:11.513754909Z" level=info msg="StartContainer for \"a0997c641d78ed54df55dfe4e9a226199205f99ad088c8f723e4f4261ebfe026\" returns successfully" Jan 15 23:46:11.513794 containerd[1623]: time="2026-01-15T23:46:11.513782149Z" level=info msg="StartContainer for \"8203c8f4ef4ef3992d2ff08c8245827de921d85365ba7aa78ec2a6091f205ad2\"" Jan 15 23:46:11.515184 containerd[1623]: time="2026-01-15T23:46:11.515151996Z" level=info msg="connecting to shim 8203c8f4ef4ef3992d2ff08c8245827de921d85365ba7aa78ec2a6091f205ad2" address="unix:///run/containerd/s/0daec9c26f6c15595453afc18e5b2e80a23c2a9d76994d50e02e87c3d6d5450c" protocol=ttrpc version=3 Jan 15 23:46:11.517014 containerd[1623]: time="2026-01-15T23:46:11.516925685Z" level=info msg="CreateContainer within sandbox \"f6bd08ce8c96c11de041647f0e6549be3ffb994cffd89616d40b6975c7904771\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"0f9d56bac994efa6b74213874400b108aefd8683233dad233862d9b06658faca\"" Jan 15 23:46:11.517541 containerd[1623]: time="2026-01-15T23:46:11.517467567Z" level=info msg="StartContainer for \"0f9d56bac994efa6b74213874400b108aefd8683233dad233862d9b06658faca\"" Jan 15 23:46:11.518545 containerd[1623]: time="2026-01-15T23:46:11.518498092Z" level=info msg="connecting to shim 0f9d56bac994efa6b74213874400b108aefd8683233dad233862d9b06658faca" address="unix:///run/containerd/s/f525541d0296420e40efa1381707f676935a79cf34336b1391fce3123491a4e2" protocol=ttrpc version=3 Jan 15 23:46:11.535630 systemd[1]: 
Started cri-containerd-8203c8f4ef4ef3992d2ff08c8245827de921d85365ba7aa78ec2a6091f205ad2.scope - libcontainer container 8203c8f4ef4ef3992d2ff08c8245827de921d85365ba7aa78ec2a6091f205ad2. Jan 15 23:46:11.539677 systemd[1]: Started cri-containerd-0f9d56bac994efa6b74213874400b108aefd8683233dad233862d9b06658faca.scope - libcontainer container 0f9d56bac994efa6b74213874400b108aefd8683233dad233862d9b06658faca. Jan 15 23:46:11.590562 containerd[1623]: time="2026-01-15T23:46:11.590231879Z" level=info msg="StartContainer for \"0f9d56bac994efa6b74213874400b108aefd8683233dad233862d9b06658faca\" returns successfully" Jan 15 23:46:11.591189 containerd[1623]: time="2026-01-15T23:46:11.590785241Z" level=info msg="StartContainer for \"8203c8f4ef4ef3992d2ff08c8245827de921d85365ba7aa78ec2a6091f205ad2\" returns successfully" Jan 15 23:46:11.621427 kubelet[2509]: I0115 23:46:11.621397 2509 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-n-b7ec270451" Jan 15 23:46:11.904328 kubelet[2509]: E0115 23:46:11.903853 2509 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-n-b7ec270451\" not found" node="ci-4459-2-2-n-b7ec270451" Jan 15 23:46:11.906992 kubelet[2509]: E0115 23:46:11.906969 2509 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-n-b7ec270451\" not found" node="ci-4459-2-2-n-b7ec270451" Jan 15 23:46:11.908718 kubelet[2509]: E0115 23:46:11.908587 2509 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-n-b7ec270451\" not found" node="ci-4459-2-2-n-b7ec270451" Jan 15 23:46:12.910200 kubelet[2509]: E0115 23:46:12.909840 2509 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-n-b7ec270451\" not found" node="ci-4459-2-2-n-b7ec270451" Jan 15 23:46:12.910200 kubelet[2509]: E0115 23:46:12.909889 2509 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-n-b7ec270451\" not found" node="ci-4459-2-2-n-b7ec270451" Jan 15 23:46:12.910200 kubelet[2509]: E0115 23:46:12.910096 2509 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-n-b7ec270451\" not found" node="ci-4459-2-2-n-b7ec270451" Jan 15 23:46:13.015919 kubelet[2509]: E0115 23:46:13.015886 2509 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4459-2-2-n-b7ec270451\" not found" node="ci-4459-2-2-n-b7ec270451" Jan 15 23:46:13.197021 kubelet[2509]: I0115 23:46:13.196600 2509 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-2-n-b7ec270451" Jan 15 23:46:13.261828 kubelet[2509]: I0115 23:46:13.261243 2509 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-n-b7ec270451" Jan 15 23:46:13.268012 kubelet[2509]: E0115 23:46:13.267982 2509 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-2-n-b7ec270451\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459-2-2-n-b7ec270451" Jan 15 23:46:13.268121 kubelet[2509]: I0115 23:46:13.268108 2509 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-2-n-b7ec270451" Jan 15 23:46:13.273952 kubelet[2509]: E0115 23:46:13.273908 
2509 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-2-2-n-b7ec270451\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4459-2-2-n-b7ec270451" Jan 15 23:46:13.274086 kubelet[2509]: I0115 23:46:13.274075 2509 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-n-b7ec270451" Jan 15 23:46:13.276254 kubelet[2509]: E0115 23:46:13.276227 2509 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-2-n-b7ec270451\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4459-2-2-n-b7ec270451" Jan 15 23:46:13.844741 kubelet[2509]: I0115 23:46:13.844705 2509 apiserver.go:52] "Watching apiserver" Jan 15 23:46:13.862160 kubelet[2509]: I0115 23:46:13.862122 2509 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jan 15 23:46:14.947490 kubelet[2509]: I0115 23:46:14.947340 2509 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-n-b7ec270451" Jan 15 23:46:15.349247 systemd[1]: Reload requested from client PID 2794 ('systemctl') (unit session-11.scope)... Jan 15 23:46:15.349262 systemd[1]: Reloading... Jan 15 23:46:15.418487 zram_generator::config[2837]: No configuration found. Jan 15 23:46:15.596346 systemd[1]: Reloading finished in 246 ms. Jan 15 23:46:15.623748 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 23:46:15.640864 systemd[1]: kubelet.service: Deactivated successfully. Jan 15 23:46:15.641123 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 23:46:15.641181 systemd[1]: kubelet.service: Consumed 1.395s CPU time, 129M memory peak. Jan 15 23:46:15.642807 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 23:46:15.792348 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 23:46:15.796674 (kubelet)[2882]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 15 23:46:15.828023 kubelet[2882]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 15 23:46:15.828023 kubelet[2882]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 15 23:46:15.828023 kubelet[2882]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
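The "no PriorityClass with name system-node-critical" failures at 23:46:13 were a startup race: the kubelet posted mirror pods for its static pods moments before the freshly started API server's bootstrap logic had created the built-in priority classes. After the restart below, the same calls fail with "already exists" instead, i.e. the mirror pods are in place. For reference, the two built-in classes and their fixed priority values (well-known Kubernetes constants, listed here for context):

    # Built-in PriorityClasses created by the API server at bootstrap.
    BUILTIN_PRIORITY = {
        "system-cluster-critical": 2_000_000_000,
        "system-node-critical":    2_000_001_000,
    }
    for name, value in sorted(BUILTIN_PRIORITY.items()):
        print(f"{name} = {value}")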
Jan 15 23:46:15.828383 kubelet[2882]: I0115 23:46:15.828068 2882 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 15 23:46:15.833533 kubelet[2882]: I0115 23:46:15.833492 2882 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Jan 15 23:46:15.833533 kubelet[2882]: I0115 23:46:15.833526 2882 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 15 23:46:15.834612 kubelet[2882]: I0115 23:46:15.834585 2882 server.go:956] "Client rotation is on, will bootstrap in background" Jan 15 23:46:15.836669 kubelet[2882]: I0115 23:46:15.836507 2882 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Jan 15 23:46:15.838825 kubelet[2882]: I0115 23:46:15.838805 2882 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 15 23:46:15.842683 kubelet[2882]: I0115 23:46:15.842617 2882 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 15 23:46:15.845221 kubelet[2882]: I0115 23:46:15.845202 2882 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Jan 15 23:46:15.845399 kubelet[2882]: I0115 23:46:15.845379 2882 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 15 23:46:15.845595 kubelet[2882]: I0115 23:46:15.845401 2882 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-2-n-b7ec270451","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 15 23:46:15.845595 kubelet[2882]: I0115 23:46:15.845572 2882 topology_manager.go:138] "Creating topology manager with none policy" Jan 15 23:46:15.845595 kubelet[2882]: I0115 23:46:15.845581 2882 container_manager_linux.go:303] "Creating device plugin manager" Jan 15 23:46:15.845748 kubelet[2882]: I0115 23:46:15.845616 2882 state_mem.go:36] "Initialized new in-memory state store" Jan 15 23:46:15.845771 kubelet[2882]: 
I0115 23:46:15.845760 2882 kubelet.go:480] "Attempting to sync node with API server" Jan 15 23:46:15.845791 kubelet[2882]: I0115 23:46:15.845772 2882 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 15 23:46:15.845812 kubelet[2882]: I0115 23:46:15.845794 2882 kubelet.go:386] "Adding apiserver pod source" Jan 15 23:46:15.845812 kubelet[2882]: I0115 23:46:15.845806 2882 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 15 23:46:15.849170 kubelet[2882]: I0115 23:46:15.848539 2882 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Jan 15 23:46:15.849170 kubelet[2882]: I0115 23:46:15.849085 2882 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Jan 15 23:46:15.853876 kubelet[2882]: I0115 23:46:15.853848 2882 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 15 23:46:15.853949 kubelet[2882]: I0115 23:46:15.853898 2882 server.go:1289] "Started kubelet" Jan 15 23:46:15.856351 kubelet[2882]: I0115 23:46:15.856324 2882 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 15 23:46:15.859598 kubelet[2882]: I0115 23:46:15.859573 2882 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 15 23:46:15.859828 kubelet[2882]: E0115 23:46:15.859802 2882 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4459-2-2-n-b7ec270451\" not found" Jan 15 23:46:15.860222 kubelet[2882]: I0115 23:46:15.860196 2882 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jan 15 23:46:15.860339 kubelet[2882]: I0115 23:46:15.860324 2882 reconciler.go:26] "Reconciler: start to sync state" Jan 15 23:46:15.860701 kubelet[2882]: I0115 23:46:15.860665 2882 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Jan 15 23:46:15.860805 kubelet[2882]: I0115 23:46:15.860766 2882 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 15 23:46:15.861050 kubelet[2882]: I0115 23:46:15.861021 2882 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 15 23:46:15.861142 kubelet[2882]: I0115 23:46:15.861121 2882 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 15 23:46:15.866377 kubelet[2882]: I0115 23:46:15.866354 2882 server.go:317] "Adding debug handlers to kubelet server" Jan 15 23:46:15.872012 kubelet[2882]: I0115 23:46:15.871983 2882 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 15 23:46:15.873837 kubelet[2882]: E0115 23:46:15.873766 2882 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 15 23:46:15.873837 kubelet[2882]: I0115 23:46:15.873778 2882 factory.go:223] Registration of the containerd container factory successfully Jan 15 23:46:15.873837 kubelet[2882]: I0115 23:46:15.873796 2882 factory.go:223] Registration of the systemd container factory successfully Jan 15 23:46:15.876892 kubelet[2882]: I0115 23:46:15.876676 2882 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Jan 15 23:46:15.878879 kubelet[2882]: I0115 23:46:15.878848 2882 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Jan 15 23:46:15.879017 kubelet[2882]: I0115 23:46:15.879005 2882 status_manager.go:230] "Starting to sync pod status with apiserver" Jan 15 23:46:15.879097 kubelet[2882]: I0115 23:46:15.879084 2882 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Jan 15 23:46:15.879724 kubelet[2882]: I0115 23:46:15.879708 2882 kubelet.go:2436] "Starting kubelet main sync loop" Jan 15 23:46:15.879848 kubelet[2882]: E0115 23:46:15.879814 2882 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 15 23:46:15.915038 kubelet[2882]: I0115 23:46:15.915011 2882 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 15 23:46:15.915828 kubelet[2882]: I0115 23:46:15.915150 2882 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 15 23:46:15.915828 kubelet[2882]: I0115 23:46:15.915175 2882 state_mem.go:36] "Initialized new in-memory state store" Jan 15 23:46:15.915828 kubelet[2882]: I0115 23:46:15.915307 2882 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jan 15 23:46:15.915828 kubelet[2882]: I0115 23:46:15.915316 2882 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jan 15 23:46:15.915828 kubelet[2882]: I0115 23:46:15.915333 2882 policy_none.go:49] "None policy: Start" Jan 15 23:46:15.915828 kubelet[2882]: I0115 23:46:15.915341 2882 memory_manager.go:186] "Starting memorymanager" policy="None" Jan 15 23:46:15.915828 kubelet[2882]: I0115 23:46:15.915350 2882 state_mem.go:35] "Initializing new in-memory state store" Jan 15 23:46:15.915828 kubelet[2882]: I0115 23:46:15.915431 2882 state_mem.go:75] "Updated machine memory state" Jan 15 23:46:15.919184 kubelet[2882]: E0115 23:46:15.919147 2882 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Jan 15 23:46:15.919340 kubelet[2882]: I0115 23:46:15.919323 2882 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 15 23:46:15.919369 kubelet[2882]: I0115 23:46:15.919341 2882 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 15 23:46:15.919768 kubelet[2882]: I0115 23:46:15.919710 2882 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 15 23:46:15.921189 kubelet[2882]: E0115 23:46:15.921158 2882 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Jan 15 23:46:15.981139 kubelet[2882]: I0115 23:46:15.981090 2882 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-n-b7ec270451" Jan 15 23:46:15.981274 kubelet[2882]: I0115 23:46:15.981233 2882 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-n-b7ec270451" Jan 15 23:46:15.982474 kubelet[2882]: I0115 23:46:15.981901 2882 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-2-n-b7ec270451" Jan 15 23:46:15.989088 kubelet[2882]: E0115 23:46:15.989047 2882 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-2-n-b7ec270451\" already exists" pod="kube-system/kube-apiserver-ci-4459-2-2-n-b7ec270451" Jan 15 23:46:16.022449 kubelet[2882]: I0115 23:46:16.022412 2882 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-n-b7ec270451" Jan 15 23:46:16.030274 kubelet[2882]: I0115 23:46:16.030228 2882 kubelet_node_status.go:124] "Node was previously registered" node="ci-4459-2-2-n-b7ec270451" Jan 15 23:46:16.030361 kubelet[2882]: I0115 23:46:16.030335 2882 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-2-n-b7ec270451" Jan 15 23:46:16.062104 kubelet[2882]: I0115 23:46:16.062065 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/317e0c512642ef48debbbbeb1b8dc0bc-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-2-n-b7ec270451\" (UID: \"317e0c512642ef48debbbbeb1b8dc0bc\") " pod="kube-system/kube-apiserver-ci-4459-2-2-n-b7ec270451" Jan 15 23:46:16.062104 kubelet[2882]: I0115 23:46:16.062105 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/0d35633e582bc9b906247ef1d1e6b3cd-ca-certs\") pod \"kube-controller-manager-ci-4459-2-2-n-b7ec270451\" (UID: \"0d35633e582bc9b906247ef1d1e6b3cd\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-n-b7ec270451" Jan 15 23:46:16.062265 kubelet[2882]: I0115 23:46:16.062130 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/0d35633e582bc9b906247ef1d1e6b3cd-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-2-n-b7ec270451\" (UID: \"0d35633e582bc9b906247ef1d1e6b3cd\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-n-b7ec270451" Jan 15 23:46:16.062265 kubelet[2882]: I0115 23:46:16.062144 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/0d35633e582bc9b906247ef1d1e6b3cd-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-2-n-b7ec270451\" (UID: \"0d35633e582bc9b906247ef1d1e6b3cd\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-n-b7ec270451" Jan 15 23:46:16.062265 kubelet[2882]: I0115 23:46:16.062161 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/317e0c512642ef48debbbbeb1b8dc0bc-ca-certs\") pod \"kube-apiserver-ci-4459-2-2-n-b7ec270451\" (UID: \"317e0c512642ef48debbbbeb1b8dc0bc\") " pod="kube-system/kube-apiserver-ci-4459-2-2-n-b7ec270451" Jan 15 23:46:16.062265 kubelet[2882]: I0115 23:46:16.062176 2882 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/0d35633e582bc9b906247ef1d1e6b3cd-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-2-n-b7ec270451\" (UID: \"0d35633e582bc9b906247ef1d1e6b3cd\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-n-b7ec270451" Jan 15 23:46:16.062265 kubelet[2882]: I0115 23:46:16.062217 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/0d35633e582bc9b906247ef1d1e6b3cd-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-2-n-b7ec270451\" (UID: \"0d35633e582bc9b906247ef1d1e6b3cd\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-n-b7ec270451" Jan 15 23:46:16.062368 kubelet[2882]: I0115 23:46:16.062263 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/22205193bf6b9442abdf805753a824ce-kubeconfig\") pod \"kube-scheduler-ci-4459-2-2-n-b7ec270451\" (UID: \"22205193bf6b9442abdf805753a824ce\") " pod="kube-system/kube-scheduler-ci-4459-2-2-n-b7ec270451" Jan 15 23:46:16.062368 kubelet[2882]: I0115 23:46:16.062330 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/317e0c512642ef48debbbbeb1b8dc0bc-k8s-certs\") pod \"kube-apiserver-ci-4459-2-2-n-b7ec270451\" (UID: \"317e0c512642ef48debbbbeb1b8dc0bc\") " pod="kube-system/kube-apiserver-ci-4459-2-2-n-b7ec270451" Jan 15 23:46:16.846751 kubelet[2882]: I0115 23:46:16.846711 2882 apiserver.go:52] "Watching apiserver" Jan 15 23:46:16.860958 kubelet[2882]: I0115 23:46:16.860820 2882 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jan 15 23:46:16.897481 kubelet[2882]: I0115 23:46:16.897289 2882 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-n-b7ec270451" Jan 15 23:46:16.898010 kubelet[2882]: I0115 23:46:16.897372 2882 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-2-n-b7ec270451" Jan 15 23:46:16.904633 kubelet[2882]: E0115 23:46:16.904598 2882 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-2-2-n-b7ec270451\" already exists" pod="kube-system/kube-controller-manager-ci-4459-2-2-n-b7ec270451" Jan 15 23:46:16.905504 kubelet[2882]: E0115 23:46:16.905477 2882 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-2-n-b7ec270451\" already exists" pod="kube-system/kube-apiserver-ci-4459-2-2-n-b7ec270451" Jan 15 23:46:16.927702 kubelet[2882]: I0115 23:46:16.927645 2882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4459-2-2-n-b7ec270451" podStartSLOduration=2.9276268229999998 podStartE2EDuration="2.927626823s" podCreationTimestamp="2026-01-15 23:46:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-15 23:46:16.917841856 +0000 UTC m=+1.117837201" watchObservedRunningTime="2026-01-15 23:46:16.927626823 +0000 UTC m=+1.127622128" Jan 15 23:46:16.939902 kubelet[2882]: I0115 23:46:16.939838 2882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4459-2-2-n-b7ec270451" 
podStartSLOduration=1.9398119619999998 podStartE2EDuration="1.939811962s" podCreationTimestamp="2026-01-15 23:46:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-15 23:46:16.928127026 +0000 UTC m=+1.128122371" watchObservedRunningTime="2026-01-15 23:46:16.939811962 +0000 UTC m=+1.139807307" Jan 15 23:46:16.951640 kubelet[2882]: I0115 23:46:16.951519 2882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4459-2-2-n-b7ec270451" podStartSLOduration=1.951491979 podStartE2EDuration="1.951491979s" podCreationTimestamp="2026-01-15 23:46:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-15 23:46:16.939976243 +0000 UTC m=+1.139971548" watchObservedRunningTime="2026-01-15 23:46:16.951491979 +0000 UTC m=+1.151487324" Jan 15 23:46:21.381109 kubelet[2882]: I0115 23:46:21.381078 2882 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jan 15 23:46:21.381804 containerd[1623]: time="2026-01-15T23:46:21.381716701Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Jan 15 23:46:21.382022 kubelet[2882]: I0115 23:46:21.381957 2882 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jan 15 23:46:22.297154 systemd[1]: Created slice kubepods-besteffort-pod15b0b211_e821_4711_9d15_9eb5babbc470.slice - libcontainer container kubepods-besteffort-pod15b0b211_e821_4711_9d15_9eb5babbc470.slice. Jan 15 23:46:22.303125 kubelet[2882]: I0115 23:46:22.303072 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/15b0b211-e821-4711-9d15-9eb5babbc470-xtables-lock\") pod \"kube-proxy-lxg2g\" (UID: \"15b0b211-e821-4711-9d15-9eb5babbc470\") " pod="kube-system/kube-proxy-lxg2g" Jan 15 23:46:22.303125 kubelet[2882]: I0115 23:46:22.303123 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/15b0b211-e821-4711-9d15-9eb5babbc470-lib-modules\") pod \"kube-proxy-lxg2g\" (UID: \"15b0b211-e821-4711-9d15-9eb5babbc470\") " pod="kube-system/kube-proxy-lxg2g" Jan 15 23:46:22.303285 kubelet[2882]: I0115 23:46:22.303142 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/15b0b211-e821-4711-9d15-9eb5babbc470-kube-proxy\") pod \"kube-proxy-lxg2g\" (UID: \"15b0b211-e821-4711-9d15-9eb5babbc470\") " pod="kube-system/kube-proxy-lxg2g" Jan 15 23:46:22.303285 kubelet[2882]: I0115 23:46:22.303159 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dnmp\" (UniqueName: \"kubernetes.io/projected/15b0b211-e821-4711-9d15-9eb5babbc470-kube-api-access-4dnmp\") pod \"kube-proxy-lxg2g\" (UID: \"15b0b211-e821-4711-9d15-9eb5babbc470\") " pod="kube-system/kube-proxy-lxg2g" Jan 15 23:46:22.571548 systemd[1]: Created slice kubepods-besteffort-pod6485e0fd_2146_4cba_a391_bc44d6a007bf.slice - libcontainer container kubepods-besteffort-pod6485e0fd_2146_4cba_a391_bc44d6a007bf.slice. 
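The pod_startup_latency_tracker entries above derive podStartSLOduration from observedRunningTime minus podCreationTimestamp; because firstStartedPulling/lastFinishedPulling sit at the zero time (the control-plane images were already on disk), the SLO and E2E figures coincide. A minimal sketch of that arithmetic in Go, using the scheduler pod's timestamps from the log and assuming only the timestamp layout shown there:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Layout matching timestamps like "2026-01-15 23:46:16.939811962 +0000 UTC".
        const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

        created, err := time.Parse(layout, "2026-01-15 23:46:15 +0000 UTC")
        if err != nil {
            panic(err)
        }
        running, err := time.Parse(layout, "2026-01-15 23:46:16.939811962 +0000 UTC")
        if err != nil {
            panic(err)
        }

        // With zero-valued pull timestamps, SLO and E2E durations are the same gap.
        fmt.Println(running.Sub(created)) // 1.939811962s, as reported above
    }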
Jan 15 23:46:22.605379 kubelet[2882]: I0115 23:46:22.605284 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5hxt\" (UniqueName: \"kubernetes.io/projected/6485e0fd-2146-4cba-a391-bc44d6a007bf-kube-api-access-t5hxt\") pod \"tigera-operator-7dcd859c48-mvm85\" (UID: \"6485e0fd-2146-4cba-a391-bc44d6a007bf\") " pod="tigera-operator/tigera-operator-7dcd859c48-mvm85" Jan 15 23:46:22.605379 kubelet[2882]: I0115 23:46:22.605340 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/6485e0fd-2146-4cba-a391-bc44d6a007bf-var-lib-calico\") pod \"tigera-operator-7dcd859c48-mvm85\" (UID: \"6485e0fd-2146-4cba-a391-bc44d6a007bf\") " pod="tigera-operator/tigera-operator-7dcd859c48-mvm85" Jan 15 23:46:22.613093 containerd[1623]: time="2026-01-15T23:46:22.613027809Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-lxg2g,Uid:15b0b211-e821-4711-9d15-9eb5babbc470,Namespace:kube-system,Attempt:0,}" Jan 15 23:46:22.633235 containerd[1623]: time="2026-01-15T23:46:22.633186827Z" level=info msg="connecting to shim a8d71ac0adfd1682bdc886806757d6cc8394d648a9926a9c83e57e626313d18b" address="unix:///run/containerd/s/a868449f988665a1d1c4b28a1be264379ca6e6c1183ae2d4c9e191dbe085a726" namespace=k8s.io protocol=ttrpc version=3 Jan 15 23:46:22.655639 systemd[1]: Started cri-containerd-a8d71ac0adfd1682bdc886806757d6cc8394d648a9926a9c83e57e626313d18b.scope - libcontainer container a8d71ac0adfd1682bdc886806757d6cc8394d648a9926a9c83e57e626313d18b. Jan 15 23:46:22.679945 containerd[1623]: time="2026-01-15T23:46:22.679835172Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-lxg2g,Uid:15b0b211-e821-4711-9d15-9eb5babbc470,Namespace:kube-system,Attempt:0,} returns sandbox id \"a8d71ac0adfd1682bdc886806757d6cc8394d648a9926a9c83e57e626313d18b\"" Jan 15 23:46:22.685450 containerd[1623]: time="2026-01-15T23:46:22.685402759Z" level=info msg="CreateContainer within sandbox \"a8d71ac0adfd1682bdc886806757d6cc8394d648a9926a9c83e57e626313d18b\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jan 15 23:46:22.697510 containerd[1623]: time="2026-01-15T23:46:22.697422697Z" level=info msg="Container 47ed721c0347a91e60bd3f755860efea1c3f7191a2158f9243bcb4f005495de9: CDI devices from CRI Config.CDIDevices: []" Jan 15 23:46:22.713195 containerd[1623]: time="2026-01-15T23:46:22.713146493Z" level=info msg="CreateContainer within sandbox \"a8d71ac0adfd1682bdc886806757d6cc8394d648a9926a9c83e57e626313d18b\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"47ed721c0347a91e60bd3f755860efea1c3f7191a2158f9243bcb4f005495de9\"" Jan 15 23:46:22.714721 containerd[1623]: time="2026-01-15T23:46:22.714688020Z" level=info msg="StartContainer for \"47ed721c0347a91e60bd3f755860efea1c3f7191a2158f9243bcb4f005495de9\"" Jan 15 23:46:22.716331 containerd[1623]: time="2026-01-15T23:46:22.716299308Z" level=info msg="connecting to shim 47ed721c0347a91e60bd3f755860efea1c3f7191a2158f9243bcb4f005495de9" address="unix:///run/containerd/s/a868449f988665a1d1c4b28a1be264379ca6e6c1183ae2d4c9e191dbe085a726" protocol=ttrpc version=3 Jan 15 23:46:22.736683 systemd[1]: Started cri-containerd-47ed721c0347a91e60bd3f755860efea1c3f7191a2158f9243bcb4f005495de9.scope - libcontainer container 47ed721c0347a91e60bd3f755860efea1c3f7191a2158f9243bcb4f005495de9. 
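The "connecting to shim" lines in this block show containerd reaching each sandbox's shim over a per-sandbox unix socket under /run/containerd/s/ using ttrpc. A quick connectivity probe of such a socket, as a sketch only (it dials the path taken from the log but does not speak ttrpc):

    package main

    import (
        "fmt"
        "net"
        "time"
    )

    func main() {
        // Shim socket path copied from the "connecting to shim" entry above.
        const sock = "/run/containerd/s/a868449f988665a1d1c4b28a1be264379ca6e6c1183ae2d4c9e191dbe085a726"

        conn, err := net.DialTimeout("unix", sock, 2*time.Second)
        if err != nil {
            fmt.Println("shim socket not reachable:", err)
            return
        }
        defer conn.Close()
        fmt.Println("shim socket accepts connections:", conn.RemoteAddr())
    }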
Jan 15 23:46:22.814003 containerd[1623]: time="2026-01-15T23:46:22.813822699Z" level=info msg="StartContainer for \"47ed721c0347a91e60bd3f755860efea1c3f7191a2158f9243bcb4f005495de9\" returns successfully" Jan 15 23:46:22.874830 containerd[1623]: time="2026-01-15T23:46:22.874717714Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-mvm85,Uid:6485e0fd-2146-4cba-a391-bc44d6a007bf,Namespace:tigera-operator,Attempt:0,}" Jan 15 23:46:22.896059 containerd[1623]: time="2026-01-15T23:46:22.895975936Z" level=info msg="connecting to shim afb65219f234113bd7429606d51b1f08aff828cadbc7363977057e6b6df437b2" address="unix:///run/containerd/s/c7339ee98f994329a9d7949d11eabc95d60e15ee8fd216ab9e0afa6e840de808" namespace=k8s.io protocol=ttrpc version=3 Jan 15 23:46:22.922867 systemd[1]: Started cri-containerd-afb65219f234113bd7429606d51b1f08aff828cadbc7363977057e6b6df437b2.scope - libcontainer container afb65219f234113bd7429606d51b1f08aff828cadbc7363977057e6b6df437b2. Jan 15 23:46:22.923428 kubelet[2882]: I0115 23:46:22.923284 2882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-lxg2g" podStartSLOduration=0.923266148 podStartE2EDuration="923.266148ms" podCreationTimestamp="2026-01-15 23:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-15 23:46:22.921843901 +0000 UTC m=+7.121839246" watchObservedRunningTime="2026-01-15 23:46:22.923266148 +0000 UTC m=+7.123261533" Jan 15 23:46:22.967200 containerd[1623]: time="2026-01-15T23:46:22.967028799Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-mvm85,Uid:6485e0fd-2146-4cba-a391-bc44d6a007bf,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"afb65219f234113bd7429606d51b1f08aff828cadbc7363977057e6b6df437b2\"" Jan 15 23:46:22.971492 containerd[1623]: time="2026-01-15T23:46:22.971278820Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Jan 15 23:46:25.131726 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4189532549.mount: Deactivated successfully. 
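The PullImage entry that closes the block above is the CRI-driven fetch of the tigera-operator image into containerd's k8s.io namespace. A hedged sketch of an equivalent pull through the containerd Go client, following the upstream getting-started example (socket path and import layout are that example's defaults, not read from this log):

    package main

    import (
        "context"
        "fmt"
        "log"

        "github.com/containerd/containerd"
        "github.com/containerd/containerd/namespaces"
    )

    func main() {
        // Default containerd socket; the kubelet's CRI traffic uses the same daemon.
        client, err := containerd.New("/run/containerd/containerd.sock")
        if err != nil {
            log.Fatal(err)
        }
        defer client.Close()

        // CRI-managed images live in the "k8s.io" namespace, as in the log above.
        ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

        img, err := client.Pull(ctx, "quay.io/tigera/operator:v1.38.7", containerd.WithPullUnpack)
        if err != nil {
            log.Fatal(err)
        }
        fmt.Println("pulled:", img.Name())
    }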
Jan 15 23:46:25.406015 containerd[1623]: time="2026-01-15T23:46:25.405901582Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 23:46:25.406995 containerd[1623]: time="2026-01-15T23:46:25.406807946Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=22152004" Jan 15 23:46:25.407976 containerd[1623]: time="2026-01-15T23:46:25.407943271Z" level=info msg="ImageCreate event name:\"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 23:46:25.410222 containerd[1623]: time="2026-01-15T23:46:25.410187442Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 23:46:25.411006 containerd[1623]: time="2026-01-15T23:46:25.410888366Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"22147999\" in 2.439562546s" Jan 15 23:46:25.411006 containerd[1623]: time="2026-01-15T23:46:25.410922806Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\"" Jan 15 23:46:25.415954 containerd[1623]: time="2026-01-15T23:46:25.415920550Z" level=info msg="CreateContainer within sandbox \"afb65219f234113bd7429606d51b1f08aff828cadbc7363977057e6b6df437b2\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jan 15 23:46:25.422754 containerd[1623]: time="2026-01-15T23:46:25.422194060Z" level=info msg="Container 970be65819e8ebfc4459cf48a5283429a0fd2540ce1bee431319cb79977e2200: CDI devices from CRI Config.CDIDevices: []" Jan 15 23:46:25.429195 containerd[1623]: time="2026-01-15T23:46:25.429151414Z" level=info msg="CreateContainer within sandbox \"afb65219f234113bd7429606d51b1f08aff828cadbc7363977057e6b6df437b2\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"970be65819e8ebfc4459cf48a5283429a0fd2540ce1bee431319cb79977e2200\"" Jan 15 23:46:25.430070 containerd[1623]: time="2026-01-15T23:46:25.430033818Z" level=info msg="StartContainer for \"970be65819e8ebfc4459cf48a5283429a0fd2540ce1bee431319cb79977e2200\"" Jan 15 23:46:25.430908 containerd[1623]: time="2026-01-15T23:46:25.430874462Z" level=info msg="connecting to shim 970be65819e8ebfc4459cf48a5283429a0fd2540ce1bee431319cb79977e2200" address="unix:///run/containerd/s/c7339ee98f994329a9d7949d11eabc95d60e15ee8fd216ab9e0afa6e840de808" protocol=ttrpc version=3 Jan 15 23:46:25.446811 systemd[1]: Started cri-containerd-970be65819e8ebfc4459cf48a5283429a0fd2540ce1bee431319cb79977e2200.scope - libcontainer container 970be65819e8ebfc4459cf48a5283429a0fd2540ce1bee431319cb79977e2200. 
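From the figures above (bytes read=22152004 in the "stop pulling image" entry, 2.439562546s wall-clock in the "Pulled image" entry), the effective pull throughput works out to roughly 8.7 MiB/s; a trivial check:

    package main

    import "fmt"

    func main() {
        // Figures from the "stop pulling image" and "Pulled image ... in 2.439562546s" entries.
        const bytesRead = 22152004.0 // bytes transferred for quay.io/tigera/operator:v1.38.7
        const seconds = 2.439562546  // wall-clock pull time reported by containerd

        mibPerSec := bytesRead / seconds / (1 << 20)
        fmt.Printf("effective pull throughput: %.2f MiB/s\n", mibPerSec) // ~8.66 MiB/s
    }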
Jan 15 23:46:25.471395 containerd[1623]: time="2026-01-15T23:46:25.471355138Z" level=info msg="StartContainer for \"970be65819e8ebfc4459cf48a5283429a0fd2540ce1bee431319cb79977e2200\" returns successfully" Jan 15 23:46:25.932518 kubelet[2882]: I0115 23:46:25.932201 2882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7dcd859c48-mvm85" podStartSLOduration=1.489824165 podStartE2EDuration="3.932171604s" podCreationTimestamp="2026-01-15 23:46:22 +0000 UTC" firstStartedPulling="2026-01-15 23:46:22.96927477 +0000 UTC m=+7.169270075" lastFinishedPulling="2026-01-15 23:46:25.411622169 +0000 UTC m=+9.611617514" observedRunningTime="2026-01-15 23:46:25.931623961 +0000 UTC m=+10.131619306" watchObservedRunningTime="2026-01-15 23:46:25.932171604 +0000 UTC m=+10.132166949" Jan 15 23:46:30.824976 sudo[1912]: pam_unix(sudo:session): session closed for user root Jan 15 23:46:30.921574 sshd[1911]: Connection closed by 68.220.241.50 port 35840 Jan 15 23:46:30.921934 sshd-session[1908]: pam_unix(sshd:session): session closed for user core Jan 15 23:46:30.929130 systemd[1]: session-11.scope: Deactivated successfully. Jan 15 23:46:30.929601 systemd[1]: session-11.scope: Consumed 7.209s CPU time, 222.7M memory peak. Jan 15 23:46:30.930722 systemd[1]: sshd@10-10.0.10.219:22-68.220.241.50:35840.service: Deactivated successfully. Jan 15 23:46:30.932700 systemd-logind[1599]: Session 11 logged out. Waiting for processes to exit. Jan 15 23:46:30.935286 systemd-logind[1599]: Removed session 11. Jan 15 23:46:37.487756 systemd[1]: Created slice kubepods-besteffort-pod7b003410_8e9f_4334_a339_7ba09d29ac13.slice - libcontainer container kubepods-besteffort-pod7b003410_8e9f_4334_a339_7ba09d29ac13.slice. Jan 15 23:46:37.503300 kubelet[2882]: I0115 23:46:37.503215 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/7b003410-8e9f-4334-a339-7ba09d29ac13-typha-certs\") pod \"calico-typha-6948775fc9-5nvhq\" (UID: \"7b003410-8e9f-4334-a339-7ba09d29ac13\") " pod="calico-system/calico-typha-6948775fc9-5nvhq" Jan 15 23:46:37.503300 kubelet[2882]: I0115 23:46:37.503282 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b003410-8e9f-4334-a339-7ba09d29ac13-tigera-ca-bundle\") pod \"calico-typha-6948775fc9-5nvhq\" (UID: \"7b003410-8e9f-4334-a339-7ba09d29ac13\") " pod="calico-system/calico-typha-6948775fc9-5nvhq" Jan 15 23:46:37.503715 kubelet[2882]: I0115 23:46:37.503355 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw68k\" (UniqueName: \"kubernetes.io/projected/7b003410-8e9f-4334-a339-7ba09d29ac13-kube-api-access-cw68k\") pod \"calico-typha-6948775fc9-5nvhq\" (UID: \"7b003410-8e9f-4334-a339-7ba09d29ac13\") " pod="calico-system/calico-typha-6948775fc9-5nvhq" Jan 15 23:46:37.675674 systemd[1]: Created slice kubepods-besteffort-pod32a0b7fc_47ac_494a_9824_824538c91fa4.slice - libcontainer container kubepods-besteffort-pod32a0b7fc_47ac_494a_9824_824538c91fa4.slice. 
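The kubepods-besteffort-pod...slice names in the "Created slice" entries suggest the systemd cgroup driver builds slice names from the pod's QoS class plus its UID with dashes mapped to underscores. A hypothetical helper reproducing that mapping (the function and its name are illustrative, not kubelet code):

    package main

    import (
        "fmt"
        "strings"
    )

    // sliceName mirrors the naming pattern visible in the log: dashes in the
    // pod UID become underscores under a kubepods-<qos>-pod prefix.
    func sliceName(qos, podUID string) string {
        return fmt.Sprintf("kubepods-%s-pod%s.slice", qos, strings.ReplaceAll(podUID, "-", "_"))
    }

    func main() {
        fmt.Println(sliceName("besteffort", "7b003410-8e9f-4334-a339-7ba09d29ac13"))
        // kubepods-besteffort-pod7b003410_8e9f_4334_a339_7ba09d29ac13.slice
    }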
Jan 15 23:46:37.705445 kubelet[2882]: I0115 23:46:37.705307 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/32a0b7fc-47ac-494a-9824-824538c91fa4-flexvol-driver-host\") pod \"calico-node-zzvbk\" (UID: \"32a0b7fc-47ac-494a-9824-824538c91fa4\") " pod="calico-system/calico-node-zzvbk" Jan 15 23:46:37.705445 kubelet[2882]: I0115 23:46:37.705406 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/32a0b7fc-47ac-494a-9824-824538c91fa4-xtables-lock\") pod \"calico-node-zzvbk\" (UID: \"32a0b7fc-47ac-494a-9824-824538c91fa4\") " pod="calico-system/calico-node-zzvbk" Jan 15 23:46:37.705705 kubelet[2882]: I0115 23:46:37.705482 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q29d2\" (UniqueName: \"kubernetes.io/projected/32a0b7fc-47ac-494a-9824-824538c91fa4-kube-api-access-q29d2\") pod \"calico-node-zzvbk\" (UID: \"32a0b7fc-47ac-494a-9824-824538c91fa4\") " pod="calico-system/calico-node-zzvbk" Jan 15 23:46:37.705705 kubelet[2882]: I0115 23:46:37.705502 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/32a0b7fc-47ac-494a-9824-824538c91fa4-node-certs\") pod \"calico-node-zzvbk\" (UID: \"32a0b7fc-47ac-494a-9824-824538c91fa4\") " pod="calico-system/calico-node-zzvbk" Jan 15 23:46:37.705705 kubelet[2882]: I0115 23:46:37.705539 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/32a0b7fc-47ac-494a-9824-824538c91fa4-policysync\") pod \"calico-node-zzvbk\" (UID: \"32a0b7fc-47ac-494a-9824-824538c91fa4\") " pod="calico-system/calico-node-zzvbk" Jan 15 23:46:37.705705 kubelet[2882]: I0115 23:46:37.705553 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/32a0b7fc-47ac-494a-9824-824538c91fa4-cni-net-dir\") pod \"calico-node-zzvbk\" (UID: \"32a0b7fc-47ac-494a-9824-824538c91fa4\") " pod="calico-system/calico-node-zzvbk" Jan 15 23:46:37.705705 kubelet[2882]: I0115 23:46:37.705568 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/32a0b7fc-47ac-494a-9824-824538c91fa4-lib-modules\") pod \"calico-node-zzvbk\" (UID: \"32a0b7fc-47ac-494a-9824-824538c91fa4\") " pod="calico-system/calico-node-zzvbk" Jan 15 23:46:37.705828 kubelet[2882]: I0115 23:46:37.705582 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32a0b7fc-47ac-494a-9824-824538c91fa4-tigera-ca-bundle\") pod \"calico-node-zzvbk\" (UID: \"32a0b7fc-47ac-494a-9824-824538c91fa4\") " pod="calico-system/calico-node-zzvbk" Jan 15 23:46:37.705828 kubelet[2882]: I0115 23:46:37.705599 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/32a0b7fc-47ac-494a-9824-824538c91fa4-cni-log-dir\") pod \"calico-node-zzvbk\" (UID: \"32a0b7fc-47ac-494a-9824-824538c91fa4\") " pod="calico-system/calico-node-zzvbk" Jan 15 23:46:37.705828 kubelet[2882]: I0115 23:46:37.705627 2882 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/32a0b7fc-47ac-494a-9824-824538c91fa4-var-lib-calico\") pod \"calico-node-zzvbk\" (UID: \"32a0b7fc-47ac-494a-9824-824538c91fa4\") " pod="calico-system/calico-node-zzvbk" Jan 15 23:46:37.705828 kubelet[2882]: I0115 23:46:37.705709 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/32a0b7fc-47ac-494a-9824-824538c91fa4-cni-bin-dir\") pod \"calico-node-zzvbk\" (UID: \"32a0b7fc-47ac-494a-9824-824538c91fa4\") " pod="calico-system/calico-node-zzvbk" Jan 15 23:46:37.705828 kubelet[2882]: I0115 23:46:37.705734 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/32a0b7fc-47ac-494a-9824-824538c91fa4-var-run-calico\") pod \"calico-node-zzvbk\" (UID: \"32a0b7fc-47ac-494a-9824-824538c91fa4\") " pod="calico-system/calico-node-zzvbk" Jan 15 23:46:37.794785 containerd[1623]: time="2026-01-15T23:46:37.794657790Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6948775fc9-5nvhq,Uid:7b003410-8e9f-4334-a339-7ba09d29ac13,Namespace:calico-system,Attempt:0,}" Jan 15 23:46:37.809706 kubelet[2882]: E0115 23:46:37.809505 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 23:46:37.809706 kubelet[2882]: W0115 23:46:37.809531 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 23:46:37.809706 kubelet[2882]: E0115 23:46:37.809552 2882 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 23:46:37.821069 containerd[1623]: time="2026-01-15T23:46:37.820985837Z" level=info msg="connecting to shim fa392ea2a550a9edfef4e61d349e3a82f216316b83606758e6715564bab2114a" address="unix:///run/containerd/s/d59758da8c17de164299e41efb9a7947a0a79d36edfd545c188f850d3264b815" namespace=k8s.io protocol=ttrpc version=3 Jan 15 23:46:37.852612 systemd[1]: Started cri-containerd-fa392ea2a550a9edfef4e61d349e3a82f216316b83606758e6715564bab2114a.scope - libcontainer container fa392ea2a550a9edfef4e61d349e3a82f216316b83606758e6715564bab2114a. Jan 15 23:46:37.859389 kubelet[2882]: E0115 23:46:37.859329 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zqjh8" podUID="fee8d1af-3972-419e-8500-84b3b6b46b71"
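The FlexVolume triplet above repeats on every probe of the nodeagent~uds plugin directory until Calico installs its driver: the uds binary is missing, the captured output is empty, and decoding an empty string as JSON fails with exactly "unexpected end of JSON input". A minimal reproduction of the decode step, with an assumed, pared-down response shape:

    package main

    import (
        "encoding/json"
        "fmt"
    )

    // driverStatus is an assumed, pared-down shape of a FlexVolume driver reply;
    // a working "init" call is expected to print JSON like {"status":"Success"}.
    type driverStatus struct {
        Status string `json:"status"`
    }

    func main() {
        var resp driverStatus

        // Empty output, as when the driver binary is not found in $PATH.
        err := json.Unmarshal([]byte(""), &resp)
        fmt.Println(err) // unexpected end of JSON input

        // A well-formed reply decodes cleanly.
        if err := json.Unmarshal([]byte(`{"status":"Success"}`), &resp); err == nil {
            fmt.Println(resp.Status) // Success
        }
    }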
Jan 15 23:46:37.901511 containerd[1623]: time="2026-01-15T23:46:37.901183504Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6948775fc9-5nvhq,Uid:7b003410-8e9f-4334-a339-7ba09d29ac13,Namespace:calico-system,Attempt:0,} returns sandbox id \"fa392ea2a550a9edfef4e61d349e3a82f216316b83606758e6715564bab2114a\"" Jan 15 23:46:37.903968 containerd[1623]: time="2026-01-15T23:46:37.903886197Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\""
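The NetworkReady=false / "cni plugin not initialized" condition logged for csi-node-driver-zqjh8 persists until calico-node writes a CNI configuration. A small check of the conventional confdir (the /etc/cni/net.d path is assumed, not read from this log):

    package main

    import (
        "fmt"
        "os"
    )

    func main() {
        // Conventional CNI configuration directory; empty until calico-node drops its conflist.
        entries, err := os.ReadDir("/etc/cni/net.d")
        if err != nil {
            fmt.Println("cannot read CNI confdir:", err)
            return
        }
        if len(entries) == 0 {
            fmt.Println("no CNI config yet: pod network stays NotReady")
            return
        }
        for _, e := range entries {
            fmt.Println("found CNI config:", e.Name())
        }
    }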
Jan 15 23:46:37.907733 kubelet[2882]: I0115 23:46:37.907655 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/fee8d1af-3972-419e-8500-84b3b6b46b71-varrun\") pod \"csi-node-driver-zqjh8\" (UID: \"fee8d1af-3972-419e-8500-84b3b6b46b71\") " pod="calico-system/csi-node-driver-zqjh8" Jan 15 23:46:37.908457 kubelet[2882]: I0115 23:46:37.907893 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqb9d\" (UniqueName: \"kubernetes.io/projected/fee8d1af-3972-419e-8500-84b3b6b46b71-kube-api-access-rqb9d\") pod \"csi-node-driver-zqjh8\" (UID: \"fee8d1af-3972-419e-8500-84b3b6b46b71\") " pod="calico-system/csi-node-driver-zqjh8" Jan 15 23:46:37.908457 kubelet[2882]: I0115 23:46:37.908085 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fee8d1af-3972-419e-8500-84b3b6b46b71-kubelet-dir\") pod \"csi-node-driver-zqjh8\" (UID: \"fee8d1af-3972-419e-8500-84b3b6b46b71\") " pod="calico-system/csi-node-driver-zqjh8" Jan 15 23:46:37.908745 kubelet[2882]: I0115 23:46:37.908269 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fee8d1af-3972-419e-8500-84b3b6b46b71-registration-dir\") pod \"csi-node-driver-zqjh8\" (UID: \"fee8d1af-3972-419e-8500-84b3b6b46b71\") " pod="calico-system/csi-node-driver-zqjh8" Jan 15 23:46:37.908745 kubelet[2882]: I0115 23:46:37.908492 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fee8d1af-3972-419e-8500-84b3b6b46b71-socket-dir\") pod \"csi-node-driver-zqjh8\" (UID: \"fee8d1af-3972-419e-8500-84b3b6b46b71\") " pod="calico-system/csi-node-driver-zqjh8"
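The registration-dir and socket-dir hostPath volumes above are the two unix-socket directories a CSI node plugin needs: kubelet watches the registration directory for new drivers, and the driver serves its node service from the plugins directory. A sketch listing any plugin sockets, assuming the standard /var/lib/kubelet locations (not read from this log):

    package main

    import (
        "fmt"
        "io/fs"
        "path/filepath"
        "strings"
    )

    func main() {
        // Standard kubelet plugin directories commonly backing the
        // registration-dir and socket-dir hostPath mounts seen above.
        for _, dir := range []string{"/var/lib/kubelet/plugins_registry", "/var/lib/kubelet/plugins"} {
            filepath.WalkDir(dir, func(path string, d fs.DirEntry, err error) error {
                if err != nil {
                    return nil // skip: directory may not exist yet on a fresh node
                }
                if !d.IsDir() && strings.HasSuffix(path, ".sock") {
                    fmt.Println("plugin socket:", path)
                }
                return nil
            })
        }
    }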
Jan 15 23:46:37.979125 containerd[1623]: time="2026-01-15T23:46:37.979022320Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-zzvbk,Uid:32a0b7fc-47ac-494a-9824-824538c91fa4,Namespace:calico-system,Attempt:0,}" Jan 15 23:46:38.000576 containerd[1623]: time="2026-01-15T23:46:38.000459664Z" level=info msg="connecting to shim 9f9addd8390571d8bca1238ddfa2f84bd3c7124d584b58425332d771cac02ebf" address="unix:///run/containerd/s/9e5f056a1874f311b505cde997a4579f24e79e7e5a18db941a008cfa12c21bb4" namespace=k8s.io protocol=ttrpc version=3 Jan 15 23:46:38.014583 kubelet[2882]: E0115 23:46:38.014169 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 23:46:38.014583 kubelet[2882]: W0115 23:46:38.014181 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 23:46:38.014583 kubelet[2882]: E0115 23:46:38.014191 2882 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Jan 15 23:46:38.014675 kubelet[2882]: E0115 23:46:38.014652 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 23:46:38.014675 kubelet[2882]: W0115 23:46:38.014668 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 23:46:38.014804 kubelet[2882]: E0115 23:46:38.014680 2882 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 23:46:38.015108 kubelet[2882]: E0115 23:46:38.015011 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 23:46:38.015108 kubelet[2882]: W0115 23:46:38.015027 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 23:46:38.015108 kubelet[2882]: E0115 23:46:38.015039 2882 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 23:46:38.015216 kubelet[2882]: E0115 23:46:38.015183 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 23:46:38.015216 kubelet[2882]: W0115 23:46:38.015191 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 23:46:38.015216 kubelet[2882]: E0115 23:46:38.015199 2882 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 23:46:38.016077 kubelet[2882]: E0115 23:46:38.015356 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 23:46:38.016077 kubelet[2882]: W0115 23:46:38.016032 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 23:46:38.016077 kubelet[2882]: E0115 23:46:38.016053 2882 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 23:46:38.016681 kubelet[2882]: E0115 23:46:38.016657 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 23:46:38.016681 kubelet[2882]: W0115 23:46:38.016678 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 23:46:38.016681 kubelet[2882]: E0115 23:46:38.016691 2882 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 23:46:38.017241 kubelet[2882]: E0115 23:46:38.016879 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 23:46:38.017241 kubelet[2882]: W0115 23:46:38.016893 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 23:46:38.017241 kubelet[2882]: E0115 23:46:38.016912 2882 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 23:46:38.017241 kubelet[2882]: E0115 23:46:38.017110 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 23:46:38.017241 kubelet[2882]: W0115 23:46:38.017120 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 23:46:38.017241 kubelet[2882]: E0115 23:46:38.017130 2882 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 23:46:38.017846 kubelet[2882]: E0115 23:46:38.017522 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 23:46:38.017846 kubelet[2882]: W0115 23:46:38.017534 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 23:46:38.017846 kubelet[2882]: E0115 23:46:38.017545 2882 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 23:46:38.017846 kubelet[2882]: E0115 23:46:38.017700 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 23:46:38.017846 kubelet[2882]: W0115 23:46:38.017783 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 23:46:38.017846 kubelet[2882]: E0115 23:46:38.017801 2882 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 23:46:38.018412 kubelet[2882]: E0115 23:46:38.018050 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 23:46:38.018412 kubelet[2882]: W0115 23:46:38.018062 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 23:46:38.018412 kubelet[2882]: E0115 23:46:38.018072 2882 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 23:46:38.019311 kubelet[2882]: E0115 23:46:38.019286 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 23:46:38.019311 kubelet[2882]: W0115 23:46:38.019309 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 23:46:38.019396 kubelet[2882]: E0115 23:46:38.019328 2882 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 23:46:38.019648 kubelet[2882]: E0115 23:46:38.019632 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 23:46:38.019648 kubelet[2882]: W0115 23:46:38.019646 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 23:46:38.019726 kubelet[2882]: E0115 23:46:38.019655 2882 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 23:46:38.019869 kubelet[2882]: E0115 23:46:38.019858 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 23:46:38.019869 kubelet[2882]: W0115 23:46:38.019868 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 23:46:38.019925 kubelet[2882]: E0115 23:46:38.019877 2882 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 23:46:38.020659 systemd[1]: Started cri-containerd-9f9addd8390571d8bca1238ddfa2f84bd3c7124d584b58425332d771cac02ebf.scope - libcontainer container 9f9addd8390571d8bca1238ddfa2f84bd3c7124d584b58425332d771cac02ebf. Jan 15 23:46:38.030301 kubelet[2882]: E0115 23:46:38.030227 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 23:46:38.030301 kubelet[2882]: W0115 23:46:38.030248 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 23:46:38.030301 kubelet[2882]: E0115 23:46:38.030264 2882 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 23:46:38.043970 containerd[1623]: time="2026-01-15T23:46:38.043933994Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-zzvbk,Uid:32a0b7fc-47ac-494a-9824-824538c91fa4,Namespace:calico-system,Attempt:0,} returns sandbox id \"9f9addd8390571d8bca1238ddfa2f84bd3c7124d584b58425332d771cac02ebf\"" Jan 15 23:46:39.452619 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3342712731.mount: Deactivated successfully. 
Jan 15 23:46:39.880702 kubelet[2882]: E0115 23:46:39.880451 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zqjh8" podUID="fee8d1af-3972-419e-8500-84b3b6b46b71"
Jan 15 23:46:40.587285 containerd[1623]: time="2026-01-15T23:46:40.586901959Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 15 23:46:40.587616 containerd[1623]: time="2026-01-15T23:46:40.587568122Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=33090687"
Jan 15 23:46:40.589466 containerd[1623]: time="2026-01-15T23:46:40.588897769Z" level=info msg="ImageCreate event name:\"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 15 23:46:40.590737 containerd[1623]: time="2026-01-15T23:46:40.590695417Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 15 23:46:40.591495 containerd[1623]: time="2026-01-15T23:46:40.591471301Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"33090541\" in 2.687546823s"
Jan 15 23:46:40.591527 containerd[1623]: time="2026-01-15T23:46:40.591505141Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\""
Jan 15 23:46:40.592329 containerd[1623]: time="2026-01-15T23:46:40.592308665Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\""
Jan 15 23:46:40.602105 containerd[1623]: time="2026-01-15T23:46:40.602043592Z" level=info msg="CreateContainer within sandbox \"fa392ea2a550a9edfef4e61d349e3a82f216316b83606758e6715564bab2114a\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Jan 15 23:46:40.609673 containerd[1623]: time="2026-01-15T23:46:40.609621469Z" level=info msg="Container 7b9942d0f433cc6ab338e069cb806b08496083332716e5aa6fe9daade2841bfa: CDI devices from CRI Config.CDIDevices: []"
Jan 15 23:46:40.620583 containerd[1623]: time="2026-01-15T23:46:40.620543682Z" level=info msg="CreateContainer within sandbox \"fa392ea2a550a9edfef4e61d349e3a82f216316b83606758e6715564bab2114a\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"7b9942d0f433cc6ab338e069cb806b08496083332716e5aa6fe9daade2841bfa\""
Jan 15 23:46:40.621287 containerd[1623]: time="2026-01-15T23:46:40.621255845Z" level=info msg="StartContainer for \"7b9942d0f433cc6ab338e069cb806b08496083332716e5aa6fe9daade2841bfa\""
Jan 15 23:46:40.622530 containerd[1623]: time="2026-01-15T23:46:40.622503091Z" level=info msg="connecting to shim 7b9942d0f433cc6ab338e069cb806b08496083332716e5aa6fe9daade2841bfa" address="unix:///run/containerd/s/d59758da8c17de164299e41efb9a7947a0a79d36edfd545c188f850d3264b815" protocol=ttrpc version=3
Jan 15 23:46:40.651618 systemd[1]: Started cri-containerd-7b9942d0f433cc6ab338e069cb806b08496083332716e5aa6fe9daade2841bfa.scope - libcontainer container 7b9942d0f433cc6ab338e069cb806b08496083332716e5aa6fe9daade2841bfa.
Jan 15 23:46:40.688422 containerd[1623]: time="2026-01-15T23:46:40.688386889Z" level=info msg="StartContainer for \"7b9942d0f433cc6ab338e069cb806b08496083332716e5aa6fe9daade2841bfa\" returns successfully"
Jan 15 23:46:40.971193 kubelet[2882]: I0115 23:46:40.971129 2882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-6948775fc9-5nvhq" podStartSLOduration=1.282470307 podStartE2EDuration="3.971092575s" podCreationTimestamp="2026-01-15 23:46:37 +0000 UTC" firstStartedPulling="2026-01-15 23:46:37.903530556 +0000 UTC m=+22.103525901" lastFinishedPulling="2026-01-15 23:46:40.592152824 +0000 UTC m=+24.792148169" observedRunningTime="2026-01-15 23:46:40.969803209 +0000 UTC m=+25.169798554" watchObservedRunningTime="2026-01-15 23:46:40.971092575 +0000 UTC m=+25.171087920"
Jan 15 23:46:41.014140 kubelet[2882]: E0115 23:46:41.014097 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 15 23:46:41.014140 kubelet[2882]: W0115 23:46:41.014122 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 15 23:46:41.014140 kubelet[2882]: E0115 23:46:41.014143 2882 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 15 23:46:41.873638 containerd[1623]: time="2026-01-15T23:46:41.873574935Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 15 23:46:41.878112 containerd[1623]: time="2026-01-15T23:46:41.878051717Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=4266741"
Jan 15 23:46:41.879153 containerd[1623]: time="2026-01-15T23:46:41.879114522Z" level=info msg="ImageCreate event name:\"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 15 23:46:41.881207 kubelet[2882]: E0115 23:46:41.881153 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zqjh8" podUID="fee8d1af-3972-419e-8500-84b3b6b46b71"
Jan 15 23:46:41.882672 containerd[1623]: time="2026-01-15T23:46:41.882626619Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 15 23:46:41.883913 containerd[1623]: time="2026-01-15T23:46:41.883268222Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5636392\" in 1.290930317s"
Jan 15 23:46:41.883913 containerd[1623]: time="2026-01-15T23:46:41.883313982Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\""
Jan 15 23:46:41.888711 containerd[1623]: time="2026-01-15T23:46:41.888675128Z" level=info msg="CreateContainer within sandbox \"9f9addd8390571d8bca1238ddfa2f84bd3c7124d584b58425332d771cac02ebf\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}"
Jan 15 23:46:41.900921 containerd[1623]: time="2026-01-15T23:46:41.899729821Z" level=info msg="Container a23f82d7656ff4de0410a6f0cca66713218a17eb101ed36a4234de02abc0140b: CDI devices from CRI Config.CDIDevices: []"
Jan 15 23:46:41.912284 containerd[1623]: time="2026-01-15T23:46:41.912226762Z" level=info msg="CreateContainer within sandbox \"9f9addd8390571d8bca1238ddfa2f84bd3c7124d584b58425332d771cac02ebf\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"a23f82d7656ff4de0410a6f0cca66713218a17eb101ed36a4234de02abc0140b\""
Jan 15 23:46:41.913916 containerd[1623]: time="2026-01-15T23:46:41.913876890Z" level=info msg="StartContainer for \"a23f82d7656ff4de0410a6f0cca66713218a17eb101ed36a4234de02abc0140b\""
Jan 15 23:46:41.916820 containerd[1623]: time="2026-01-15T23:46:41.916782024Z" level=info msg="connecting to shim a23f82d7656ff4de0410a6f0cca66713218a17eb101ed36a4234de02abc0140b" address="unix:///run/containerd/s/9e5f056a1874f311b505cde997a4579f24e79e7e5a18db941a008cfa12c21bb4" protocol=ttrpc version=3
Jan 15 23:46:41.940645 systemd[1]: Started cri-containerd-a23f82d7656ff4de0410a6f0cca66713218a17eb101ed36a4234de02abc0140b.scope - libcontainer container a23f82d7656ff4de0410a6f0cca66713218a17eb101ed36a4234de02abc0140b.
Jan 15 23:46:41.961296 kubelet[2882]: I0115 23:46:41.961265 2882 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 15 23:46:42.013373 containerd[1623]: time="2026-01-15T23:46:42.013317050Z" level=info msg="StartContainer for \"a23f82d7656ff4de0410a6f0cca66713218a17eb101ed36a4234de02abc0140b\" returns successfully"
Jan 15 23:46:42.024143 kubelet[2882]: E0115 23:46:42.024113 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 15 23:46:42.024143 kubelet[2882]: W0115 23:46:42.024137 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 15 23:46:42.024655 kubelet[2882]: E0115 23:46:42.024158 2882 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 15 23:46:42.032700 systemd[1]: cri-containerd-a23f82d7656ff4de0410a6f0cca66713218a17eb101ed36a4234de02abc0140b.scope: Deactivated successfully.
Jan 15 23:46:42.038866 containerd[1623]: time="2026-01-15T23:46:42.038159050Z" level=info msg="received container exit event container_id:\"a23f82d7656ff4de0410a6f0cca66713218a17eb101ed36a4234de02abc0140b\" id:\"a23f82d7656ff4de0410a6f0cca66713218a17eb101ed36a4234de02abc0140b\" pid:3578 exited_at:{seconds:1768520802 nanos:36588042}"
Jan 15 23:46:42.072374 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a23f82d7656ff4de0410a6f0cca66713218a17eb101ed36a4234de02abc0140b-rootfs.mount: Deactivated successfully.
Jan 15 23:46:43.881019 kubelet[2882]: E0115 23:46:43.880959 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zqjh8" podUID="fee8d1af-3972-419e-8500-84b3b6b46b71"
Jan 15 23:46:44.971126 containerd[1623]: time="2026-01-15T23:46:44.971069179Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\""
Jan 15 23:46:45.880299 kubelet[2882]: E0115 23:46:45.880197 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zqjh8" podUID="fee8d1af-3972-419e-8500-84b3b6b46b71"
Jan 15 23:46:47.320511 containerd[1623]: time="2026-01-15T23:46:47.319849646Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 15 23:46:47.321745 containerd[1623]: time="2026-01-15T23:46:47.321709495Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=65925816"
Jan 15 23:46:47.323480 containerd[1623]: time="2026-01-15T23:46:47.323454743Z" level=info msg="ImageCreate event name:\"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 15 23:46:47.325491 containerd[1623]: time="2026-01-15T23:46:47.325421713Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 15 23:46:47.326174 containerd[1623]: time="2026-01-15T23:46:47.326139516Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"67295507\" in 2.355026337s"
\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"67295507\" in 2.355026337s" Jan 15 23:46:47.326174 containerd[1623]: time="2026-01-15T23:46:47.326171756Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\"" Jan 15 23:46:47.331518 containerd[1623]: time="2026-01-15T23:46:47.331281661Z" level=info msg="CreateContainer within sandbox \"9f9addd8390571d8bca1238ddfa2f84bd3c7124d584b58425332d771cac02ebf\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jan 15 23:46:47.343912 containerd[1623]: time="2026-01-15T23:46:47.343829602Z" level=info msg="Container e0eb681325e9cad37cf496ec3052343c0852f1485a002772af3e6d2cca0468b4: CDI devices from CRI Config.CDIDevices: []" Jan 15 23:46:47.345987 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2267938765.mount: Deactivated successfully. Jan 15 23:46:47.354277 containerd[1623]: time="2026-01-15T23:46:47.354239292Z" level=info msg="CreateContainer within sandbox \"9f9addd8390571d8bca1238ddfa2f84bd3c7124d584b58425332d771cac02ebf\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"e0eb681325e9cad37cf496ec3052343c0852f1485a002772af3e6d2cca0468b4\"" Jan 15 23:46:47.355178 containerd[1623]: time="2026-01-15T23:46:47.355126176Z" level=info msg="StartContainer for \"e0eb681325e9cad37cf496ec3052343c0852f1485a002772af3e6d2cca0468b4\"" Jan 15 23:46:47.356694 containerd[1623]: time="2026-01-15T23:46:47.356670024Z" level=info msg="connecting to shim e0eb681325e9cad37cf496ec3052343c0852f1485a002772af3e6d2cca0468b4" address="unix:///run/containerd/s/9e5f056a1874f311b505cde997a4579f24e79e7e5a18db941a008cfa12c21bb4" protocol=ttrpc version=3 Jan 15 23:46:47.379630 systemd[1]: Started cri-containerd-e0eb681325e9cad37cf496ec3052343c0852f1485a002772af3e6d2cca0468b4.scope - libcontainer container e0eb681325e9cad37cf496ec3052343c0852f1485a002772af3e6d2cca0468b4. Jan 15 23:46:47.467566 containerd[1623]: time="2026-01-15T23:46:47.467513399Z" level=info msg="StartContainer for \"e0eb681325e9cad37cf496ec3052343c0852f1485a002772af3e6d2cca0468b4\" returns successfully" Jan 15 23:46:47.881478 kubelet[2882]: E0115 23:46:47.880971 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zqjh8" podUID="fee8d1af-3972-419e-8500-84b3b6b46b71" Jan 15 23:46:48.723694 containerd[1623]: time="2026-01-15T23:46:48.723652308Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 15 23:46:48.725571 systemd[1]: cri-containerd-e0eb681325e9cad37cf496ec3052343c0852f1485a002772af3e6d2cca0468b4.scope: Deactivated successfully. Jan 15 23:46:48.725844 systemd[1]: cri-containerd-e0eb681325e9cad37cf496ec3052343c0852f1485a002772af3e6d2cca0468b4.scope: Consumed 453ms CPU time, 186.9M memory peak, 165.9M written to disk. 
Jan 15 23:46:48.727299 containerd[1623]: time="2026-01-15T23:46:48.727248845Z" level=info msg="received container exit event container_id:\"e0eb681325e9cad37cf496ec3052343c0852f1485a002772af3e6d2cca0468b4\" id:\"e0eb681325e9cad37cf496ec3052343c0852f1485a002772af3e6d2cca0468b4\" pid:3653 exited_at:{seconds:1768520808 nanos:727005084}" Jan 15 23:46:48.746476 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e0eb681325e9cad37cf496ec3052343c0852f1485a002772af3e6d2cca0468b4-rootfs.mount: Deactivated successfully. Jan 15 23:46:48.773988 kubelet[2882]: I0115 23:46:48.773944 2882 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Jan 15 23:46:50.066389 systemd[1]: Created slice kubepods-burstable-pod0a185b21_6a30_4d90_a4b5_3899fa71bcad.slice - libcontainer container kubepods-burstable-pod0a185b21_6a30_4d90_a4b5_3899fa71bcad.slice. Jan 15 23:46:50.076376 systemd[1]: Created slice kubepods-besteffort-podff12f75f_0f47_4e03_a069_ef3f612b51b0.slice - libcontainer container kubepods-besteffort-podff12f75f_0f47_4e03_a069_ef3f612b51b0.slice. Jan 15 23:46:50.115848 kubelet[2882]: I0115 23:46:50.115777 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0a185b21-6a30-4d90-a4b5-3899fa71bcad-config-volume\") pod \"coredns-674b8bbfcf-zzfg8\" (UID: \"0a185b21-6a30-4d90-a4b5-3899fa71bcad\") " pod="kube-system/coredns-674b8bbfcf-zzfg8" Jan 15 23:46:50.115848 kubelet[2882]: I0115 23:46:50.115820 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/ff12f75f-0f47-4e03-a069-ef3f612b51b0-calico-apiserver-certs\") pod \"calico-apiserver-76998f65d4-hpm9k\" (UID: \"ff12f75f-0f47-4e03-a069-ef3f612b51b0\") " pod="calico-apiserver/calico-apiserver-76998f65d4-hpm9k" Jan 15 23:46:50.115848 kubelet[2882]: I0115 23:46:50.115842 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4zpx\" (UniqueName: \"kubernetes.io/projected/ff12f75f-0f47-4e03-a069-ef3f612b51b0-kube-api-access-x4zpx\") pod \"calico-apiserver-76998f65d4-hpm9k\" (UID: \"ff12f75f-0f47-4e03-a069-ef3f612b51b0\") " pod="calico-apiserver/calico-apiserver-76998f65d4-hpm9k" Jan 15 23:46:50.115848 kubelet[2882]: I0115 23:46:50.115861 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cpdd\" (UniqueName: \"kubernetes.io/projected/0a185b21-6a30-4d90-a4b5-3899fa71bcad-kube-api-access-4cpdd\") pod \"coredns-674b8bbfcf-zzfg8\" (UID: \"0a185b21-6a30-4d90-a4b5-3899fa71bcad\") " pod="kube-system/coredns-674b8bbfcf-zzfg8" Jan 15 23:46:50.196336 systemd[1]: Created slice kubepods-besteffort-pod717a4ddd_81e7_49b2_b875_208b71de27d1.slice - libcontainer container kubepods-besteffort-pod717a4ddd_81e7_49b2_b875_208b71de27d1.slice. Jan 15 23:46:50.203196 systemd[1]: Created slice kubepods-besteffort-podfee8d1af_3972_419e_8500_84b3b6b46b71.slice - libcontainer container kubepods-besteffort-podfee8d1af_3972_419e_8500_84b3b6b46b71.slice. 
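The reload failure just above ("no network config found in /etc/cni/net.d: cni plugin not initialized") is also what has been driving the NetworkReady=false pod errors: containerd's CRI plugin rescans that directory on every write event and keeps the network NotReady until at least one parseable config appears, which the install-cni container is still in the middle of writing. A stdlib-only sketch of the same directory check, under the assumption of the usual *.conflist layout (the confList struct is illustrative, not containerd's code):

package main

import (
	"encoding/json"
	"fmt"
	"os"
	"path/filepath"
)

// confList models just enough of a CNI .conflist to decide whether it is
// usable; the real loader checks considerably more than this.
type confList struct {
	CNIVersion string            `json:"cniVersion"`
	Name       string            `json:"name"`
	Plugins    []json.RawMessage `json:"plugins"`
}

func main() {
	matches, _ := filepath.Glob("/etc/cni/net.d/*.conflist")
	for _, path := range matches {
		data, err := os.ReadFile(path)
		if err != nil {
			continue
		}
		var cl confList
		if err := json.Unmarshal(data, &cl); err != nil || len(cl.Plugins) == 0 {
			continue // unparseable or empty lists do not count, as in the log
		}
		fmt.Printf("usable CNI config %q found in %s\n", cl.Name, path)
		return
	}
	fmt.Println("no network config found in /etc/cni/net.d: cni plugin not initialized")
}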
Jan 15 23:46:50.207681 containerd[1623]: time="2026-01-15T23:46:50.207619117Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zqjh8,Uid:fee8d1af-3972-419e-8500-84b3b6b46b71,Namespace:calico-system,Attempt:0,}" Jan 15 23:46:50.217838 systemd[1]: Created slice kubepods-besteffort-podbd4ec28f_506a_4bc3_a31b_47884dea56d8.slice - libcontainer container kubepods-besteffort-podbd4ec28f_506a_4bc3_a31b_47884dea56d8.slice. Jan 15 23:46:50.231355 systemd[1]: Created slice kubepods-besteffort-podc476e4d7_47c4_4b45_afc2_049d681292b9.slice - libcontainer container kubepods-besteffort-podc476e4d7_47c4_4b45_afc2_049d681292b9.slice. Jan 15 23:46:50.248301 systemd[1]: Created slice kubepods-besteffort-podfc5a246d_dfc3_43e7_b2c7_409da1aecc92.slice - libcontainer container kubepods-besteffort-podfc5a246d_dfc3_43e7_b2c7_409da1aecc92.slice. Jan 15 23:46:50.255527 systemd[1]: Created slice kubepods-burstable-pod9e61b7bf_93dc_4d27_b649_6160a63ab763.slice - libcontainer container kubepods-burstable-pod9e61b7bf_93dc_4d27_b649_6160a63ab763.slice. Jan 15 23:46:50.298153 containerd[1623]: time="2026-01-15T23:46:50.298104474Z" level=error msg="Failed to destroy network for sandbox \"02b8500ed5c03ccc88c4bb87220e7c0abf7a98b15cd0dde2550c01c6d757a92d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 23:46:50.300169 containerd[1623]: time="2026-01-15T23:46:50.300105524Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zqjh8,Uid:fee8d1af-3972-419e-8500-84b3b6b46b71,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"02b8500ed5c03ccc88c4bb87220e7c0abf7a98b15cd0dde2550c01c6d757a92d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 23:46:50.300142 systemd[1]: run-netns-cni\x2dda8313aa\x2dced9\x2dfc12\x2d7b00\x2dc5c2cb919147.mount: Deactivated successfully. 
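The mount unit systemd just cleaned up, run-netns-cni\x2dda8313aa\x2dced9\x2dfc12\x2d7b00\x2dc5c2cb919147.mount, shows systemd's path escaping: '/' becomes the separator '-', and any byte outside [a-zA-Z0-9:_.] is hex-escaped, which is why a literal '-' appears as \x2d (and '~' as \x7e in the kubelet volume mounts later in this log). A small sketch of that encoding, mirroring systemd-escape --path for these cases (escapePath is a hypothetical helper; corner cases such as a leading dot are omitted):

package main

import (
	"fmt"
	"strings"
)

// escapePath applies systemd's unit-name escaping to a filesystem path:
// strip outer slashes, turn the remaining '/' into '-', and hex-escape any
// byte that is not [a-zA-Z0-9:_.].
func escapePath(path string) string {
	var b strings.Builder
	trimmed := strings.Trim(path, "/")
	for i := 0; i < len(trimmed); i++ {
		c := trimmed[i]
		switch {
		case c == '/':
			b.WriteByte('-')
		case c >= 'a' && c <= 'z', c >= 'A' && c <= 'Z',
			c >= '0' && c <= '9', c == ':', c == '_', c == '.':
			b.WriteByte(c)
		default:
			fmt.Fprintf(&b, `\x%02x`, c) // '-' -> \x2d, '~' -> \x7e
		}
	}
	return b.String()
}

func main() {
	// Matches the journal's run-netns-cni\x2d... mount unit naming.
	fmt.Println(escapePath("/run/netns/cni-da8313aa") + ".mount")
}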
Jan 15 23:46:50.300608 kubelet[2882]: E0115 23:46:50.300551 2882 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"02b8500ed5c03ccc88c4bb87220e7c0abf7a98b15cd0dde2550c01c6d757a92d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 23:46:50.300681 kubelet[2882]: E0115 23:46:50.300638 2882 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"02b8500ed5c03ccc88c4bb87220e7c0abf7a98b15cd0dde2550c01c6d757a92d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-zqjh8" Jan 15 23:46:50.300681 kubelet[2882]: E0115 23:46:50.300661 2882 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"02b8500ed5c03ccc88c4bb87220e7c0abf7a98b15cd0dde2550c01c6d757a92d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-zqjh8" Jan 15 23:46:50.300753 kubelet[2882]: E0115 23:46:50.300721 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-zqjh8_calico-system(fee8d1af-3972-419e-8500-84b3b6b46b71)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-zqjh8_calico-system(fee8d1af-3972-419e-8500-84b3b6b46b71)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"02b8500ed5c03ccc88c4bb87220e7c0abf7a98b15cd0dde2550c01c6d757a92d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-zqjh8" podUID="fee8d1af-3972-419e-8500-84b3b6b46b71" Jan 15 23:46:50.317373 kubelet[2882]: I0115 23:46:50.316923 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5wcb\" (UniqueName: \"kubernetes.io/projected/9e61b7bf-93dc-4d27-b649-6160a63ab763-kube-api-access-g5wcb\") pod \"coredns-674b8bbfcf-blfkv\" (UID: \"9e61b7bf-93dc-4d27-b649-6160a63ab763\") " pod="kube-system/coredns-674b8bbfcf-blfkv" Jan 15 23:46:50.317373 kubelet[2882]: I0115 23:46:50.316986 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9n5x\" (UniqueName: \"kubernetes.io/projected/c476e4d7-47c4-4b45-afc2-049d681292b9-kube-api-access-k9n5x\") pod \"goldmane-666569f655-t2blj\" (UID: \"c476e4d7-47c4-4b45-afc2-049d681292b9\") " pod="calico-system/goldmane-666569f655-t2blj" Jan 15 23:46:50.317373 kubelet[2882]: I0115 23:46:50.317028 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/717a4ddd-81e7-49b2-b875-208b71de27d1-calico-apiserver-certs\") pod \"calico-apiserver-76998f65d4-lbk6r\" (UID: \"717a4ddd-81e7-49b2-b875-208b71de27d1\") " pod="calico-apiserver/calico-apiserver-76998f65d4-lbk6r" Jan 15 23:46:50.317373 kubelet[2882]: I0115 23:46:50.317085 2882 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/bd4ec28f-506a-4bc3-a31b-47884dea56d8-whisker-backend-key-pair\") pod \"whisker-7fbf6b8887-kvn9m\" (UID: \"bd4ec28f-506a-4bc3-a31b-47884dea56d8\") " pod="calico-system/whisker-7fbf6b8887-kvn9m" Jan 15 23:46:50.317373 kubelet[2882]: I0115 23:46:50.317105 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4ln9\" (UniqueName: \"kubernetes.io/projected/bd4ec28f-506a-4bc3-a31b-47884dea56d8-kube-api-access-r4ln9\") pod \"whisker-7fbf6b8887-kvn9m\" (UID: \"bd4ec28f-506a-4bc3-a31b-47884dea56d8\") " pod="calico-system/whisker-7fbf6b8887-kvn9m" Jan 15 23:46:50.317591 kubelet[2882]: I0115 23:46:50.317129 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kbff\" (UniqueName: \"kubernetes.io/projected/fc5a246d-dfc3-43e7-b2c7-409da1aecc92-kube-api-access-4kbff\") pod \"calico-kube-controllers-69897db5bb-bgndc\" (UID: \"fc5a246d-dfc3-43e7-b2c7-409da1aecc92\") " pod="calico-system/calico-kube-controllers-69897db5bb-bgndc" Jan 15 23:46:50.317591 kubelet[2882]: I0115 23:46:50.317145 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c476e4d7-47c4-4b45-afc2-049d681292b9-goldmane-ca-bundle\") pod \"goldmane-666569f655-t2blj\" (UID: \"c476e4d7-47c4-4b45-afc2-049d681292b9\") " pod="calico-system/goldmane-666569f655-t2blj" Jan 15 23:46:50.317591 kubelet[2882]: I0115 23:46:50.317166 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/c476e4d7-47c4-4b45-afc2-049d681292b9-goldmane-key-pair\") pod \"goldmane-666569f655-t2blj\" (UID: \"c476e4d7-47c4-4b45-afc2-049d681292b9\") " pod="calico-system/goldmane-666569f655-t2blj" Jan 15 23:46:50.317591 kubelet[2882]: I0115 23:46:50.317269 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l7vh\" (UniqueName: \"kubernetes.io/projected/717a4ddd-81e7-49b2-b875-208b71de27d1-kube-api-access-8l7vh\") pod \"calico-apiserver-76998f65d4-lbk6r\" (UID: \"717a4ddd-81e7-49b2-b875-208b71de27d1\") " pod="calico-apiserver/calico-apiserver-76998f65d4-lbk6r" Jan 15 23:46:50.317591 kubelet[2882]: I0115 23:46:50.317308 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd4ec28f-506a-4bc3-a31b-47884dea56d8-whisker-ca-bundle\") pod \"whisker-7fbf6b8887-kvn9m\" (UID: \"bd4ec28f-506a-4bc3-a31b-47884dea56d8\") " pod="calico-system/whisker-7fbf6b8887-kvn9m" Jan 15 23:46:50.317709 kubelet[2882]: I0115 23:46:50.317325 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc5a246d-dfc3-43e7-b2c7-409da1aecc92-tigera-ca-bundle\") pod \"calico-kube-controllers-69897db5bb-bgndc\" (UID: \"fc5a246d-dfc3-43e7-b2c7-409da1aecc92\") " pod="calico-system/calico-kube-controllers-69897db5bb-bgndc" Jan 15 23:46:50.317709 kubelet[2882]: I0115 23:46:50.317341 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c476e4d7-47c4-4b45-afc2-049d681292b9-config\") pod \"goldmane-666569f655-t2blj\" (UID: \"c476e4d7-47c4-4b45-afc2-049d681292b9\") " pod="calico-system/goldmane-666569f655-t2blj" Jan 15 23:46:50.317709 kubelet[2882]: I0115 23:46:50.317375 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9e61b7bf-93dc-4d27-b649-6160a63ab763-config-volume\") pod \"coredns-674b8bbfcf-blfkv\" (UID: \"9e61b7bf-93dc-4d27-b649-6160a63ab763\") " pod="kube-system/coredns-674b8bbfcf-blfkv" Jan 15 23:46:50.372318 containerd[1623]: time="2026-01-15T23:46:50.372267472Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-zzfg8,Uid:0a185b21-6a30-4d90-a4b5-3899fa71bcad,Namespace:kube-system,Attempt:0,}" Jan 15 23:46:50.380365 containerd[1623]: time="2026-01-15T23:46:50.380067950Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76998f65d4-hpm9k,Uid:ff12f75f-0f47-4e03-a069-ef3f612b51b0,Namespace:calico-apiserver,Attempt:0,}" Jan 15 23:46:50.430654 containerd[1623]: time="2026-01-15T23:46:50.430603994Z" level=error msg="Failed to destroy network for sandbox \"c2315b6caaf87538ebcd10c7efc1f0e16cdb770e1c8aab92550c49fca65e5e9e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 23:46:50.434458 containerd[1623]: time="2026-01-15T23:46:50.433864050Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-zzfg8,Uid:0a185b21-6a30-4d90-a4b5-3899fa71bcad,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c2315b6caaf87538ebcd10c7efc1f0e16cdb770e1c8aab92550c49fca65e5e9e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 23:46:50.435517 kubelet[2882]: E0115 23:46:50.434656 2882 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c2315b6caaf87538ebcd10c7efc1f0e16cdb770e1c8aab92550c49fca65e5e9e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 23:46:50.435517 kubelet[2882]: E0115 23:46:50.434721 2882 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c2315b6caaf87538ebcd10c7efc1f0e16cdb770e1c8aab92550c49fca65e5e9e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-zzfg8" Jan 15 23:46:50.435517 kubelet[2882]: E0115 23:46:50.434739 2882 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c2315b6caaf87538ebcd10c7efc1f0e16cdb770e1c8aab92550c49fca65e5e9e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-zzfg8" Jan 15 23:46:50.435650 kubelet[2882]: E0115 23:46:50.435419 2882 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-zzfg8_kube-system(0a185b21-6a30-4d90-a4b5-3899fa71bcad)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-zzfg8_kube-system(0a185b21-6a30-4d90-a4b5-3899fa71bcad)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c2315b6caaf87538ebcd10c7efc1f0e16cdb770e1c8aab92550c49fca65e5e9e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-zzfg8" podUID="0a185b21-6a30-4d90-a4b5-3899fa71bcad" Jan 15 23:46:50.446745 containerd[1623]: time="2026-01-15T23:46:50.446695672Z" level=error msg="Failed to destroy network for sandbox \"6898b1e976b77849638839e67274e60f267c68b760fb46c4d9a99e449321ebe4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 23:46:50.448013 containerd[1623]: time="2026-01-15T23:46:50.447967478Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76998f65d4-hpm9k,Uid:ff12f75f-0f47-4e03-a069-ef3f612b51b0,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6898b1e976b77849638839e67274e60f267c68b760fb46c4d9a99e449321ebe4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 23:46:50.448280 kubelet[2882]: E0115 23:46:50.448217 2882 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6898b1e976b77849638839e67274e60f267c68b760fb46c4d9a99e449321ebe4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 23:46:50.448344 kubelet[2882]: E0115 23:46:50.448295 2882 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6898b1e976b77849638839e67274e60f267c68b760fb46c4d9a99e449321ebe4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-76998f65d4-hpm9k" Jan 15 23:46:50.448344 kubelet[2882]: E0115 23:46:50.448331 2882 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6898b1e976b77849638839e67274e60f267c68b760fb46c4d9a99e449321ebe4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-76998f65d4-hpm9k" Jan 15 23:46:50.448412 kubelet[2882]: E0115 23:46:50.448384 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-76998f65d4-hpm9k_calico-apiserver(ff12f75f-0f47-4e03-a069-ef3f612b51b0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-apiserver-76998f65d4-hpm9k_calico-apiserver(ff12f75f-0f47-4e03-a069-ef3f612b51b0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6898b1e976b77849638839e67274e60f267c68b760fb46c4d9a99e449321ebe4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-76998f65d4-hpm9k" podUID="ff12f75f-0f47-4e03-a069-ef3f612b51b0" Jan 15 23:46:50.501033 containerd[1623]: time="2026-01-15T23:46:50.500901974Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76998f65d4-lbk6r,Uid:717a4ddd-81e7-49b2-b875-208b71de27d1,Namespace:calico-apiserver,Attempt:0,}" Jan 15 23:46:50.522839 containerd[1623]: time="2026-01-15T23:46:50.522800359Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7fbf6b8887-kvn9m,Uid:bd4ec28f-506a-4bc3-a31b-47884dea56d8,Namespace:calico-system,Attempt:0,}" Jan 15 23:46:50.548058 containerd[1623]: time="2026-01-15T23:46:50.547730480Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-t2blj,Uid:c476e4d7-47c4-4b45-afc2-049d681292b9,Namespace:calico-system,Attempt:0,}" Jan 15 23:46:50.549473 containerd[1623]: time="2026-01-15T23:46:50.549414568Z" level=error msg="Failed to destroy network for sandbox \"15b0328c41a3e7dab1b1e39a49edb6d0ef4aacbe6a84ea219afc21ee1281a38d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 23:46:50.551638 containerd[1623]: time="2026-01-15T23:46:50.551563498Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76998f65d4-lbk6r,Uid:717a4ddd-81e7-49b2-b875-208b71de27d1,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"15b0328c41a3e7dab1b1e39a49edb6d0ef4aacbe6a84ea219afc21ee1281a38d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 23:46:50.552297 kubelet[2882]: E0115 23:46:50.551841 2882 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"15b0328c41a3e7dab1b1e39a49edb6d0ef4aacbe6a84ea219afc21ee1281a38d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 23:46:50.552297 kubelet[2882]: E0115 23:46:50.551898 2882 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"15b0328c41a3e7dab1b1e39a49edb6d0ef4aacbe6a84ea219afc21ee1281a38d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-76998f65d4-lbk6r" Jan 15 23:46:50.552297 kubelet[2882]: E0115 23:46:50.551917 2882 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"15b0328c41a3e7dab1b1e39a49edb6d0ef4aacbe6a84ea219afc21ee1281a38d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-76998f65d4-lbk6r" Jan 15 23:46:50.552511 kubelet[2882]: E0115 23:46:50.551968 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-76998f65d4-lbk6r_calico-apiserver(717a4ddd-81e7-49b2-b875-208b71de27d1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-76998f65d4-lbk6r_calico-apiserver(717a4ddd-81e7-49b2-b875-208b71de27d1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"15b0328c41a3e7dab1b1e39a49edb6d0ef4aacbe6a84ea219afc21ee1281a38d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-76998f65d4-lbk6r" podUID="717a4ddd-81e7-49b2-b875-208b71de27d1" Jan 15 23:46:50.556264 containerd[1623]: time="2026-01-15T23:46:50.555196716Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-69897db5bb-bgndc,Uid:fc5a246d-dfc3-43e7-b2c7-409da1aecc92,Namespace:calico-system,Attempt:0,}" Jan 15 23:46:50.558662 containerd[1623]: time="2026-01-15T23:46:50.558627092Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-blfkv,Uid:9e61b7bf-93dc-4d27-b649-6160a63ab763,Namespace:kube-system,Attempt:0,}" Jan 15 23:46:50.594833 containerd[1623]: time="2026-01-15T23:46:50.594606746Z" level=error msg="Failed to destroy network for sandbox \"b6e0a41703a9ccc5adef823ae40cdf4e21ba4b3d64890c4afebff430be2ff0c7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 23:46:50.597596 containerd[1623]: time="2026-01-15T23:46:50.597540880Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7fbf6b8887-kvn9m,Uid:bd4ec28f-506a-4bc3-a31b-47884dea56d8,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b6e0a41703a9ccc5adef823ae40cdf4e21ba4b3d64890c4afebff430be2ff0c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 23:46:50.597830 kubelet[2882]: E0115 23:46:50.597774 2882 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b6e0a41703a9ccc5adef823ae40cdf4e21ba4b3d64890c4afebff430be2ff0c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 23:46:50.597880 kubelet[2882]: E0115 23:46:50.597828 2882 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b6e0a41703a9ccc5adef823ae40cdf4e21ba4b3d64890c4afebff430be2ff0c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7fbf6b8887-kvn9m" Jan 15 23:46:50.597880 kubelet[2882]: E0115 23:46:50.597850 2882 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = 
failed to setup network for sandbox \"b6e0a41703a9ccc5adef823ae40cdf4e21ba4b3d64890c4afebff430be2ff0c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7fbf6b8887-kvn9m" Jan 15 23:46:50.598041 kubelet[2882]: E0115 23:46:50.597900 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-7fbf6b8887-kvn9m_calico-system(bd4ec28f-506a-4bc3-a31b-47884dea56d8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-7fbf6b8887-kvn9m_calico-system(bd4ec28f-506a-4bc3-a31b-47884dea56d8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b6e0a41703a9ccc5adef823ae40cdf4e21ba4b3d64890c4afebff430be2ff0c7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-7fbf6b8887-kvn9m" podUID="bd4ec28f-506a-4bc3-a31b-47884dea56d8" Jan 15 23:46:50.616161 containerd[1623]: time="2026-01-15T23:46:50.616092050Z" level=error msg="Failed to destroy network for sandbox \"55af3a6b9e2b568ed71ab2f7c6d0a829b7ef5bc6950c7436e4aef5889626709b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 23:46:50.617661 containerd[1623]: time="2026-01-15T23:46:50.617599417Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-t2blj,Uid:c476e4d7-47c4-4b45-afc2-049d681292b9,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"55af3a6b9e2b568ed71ab2f7c6d0a829b7ef5bc6950c7436e4aef5889626709b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 23:46:50.618421 kubelet[2882]: E0115 23:46:50.617856 2882 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"55af3a6b9e2b568ed71ab2f7c6d0a829b7ef5bc6950c7436e4aef5889626709b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 23:46:50.618421 kubelet[2882]: E0115 23:46:50.617913 2882 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"55af3a6b9e2b568ed71ab2f7c6d0a829b7ef5bc6950c7436e4aef5889626709b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-t2blj" Jan 15 23:46:50.618421 kubelet[2882]: E0115 23:46:50.617933 2882 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"55af3a6b9e2b568ed71ab2f7c6d0a829b7ef5bc6950c7436e4aef5889626709b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-t2blj" Jan 15 23:46:50.618581 
kubelet[2882]: E0115 23:46:50.618004 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-666569f655-t2blj_calico-system(c476e4d7-47c4-4b45-afc2-049d681292b9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-666569f655-t2blj_calico-system(c476e4d7-47c4-4b45-afc2-049d681292b9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"55af3a6b9e2b568ed71ab2f7c6d0a829b7ef5bc6950c7436e4aef5889626709b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-666569f655-t2blj" podUID="c476e4d7-47c4-4b45-afc2-049d681292b9" Jan 15 23:46:50.621575 containerd[1623]: time="2026-01-15T23:46:50.621527676Z" level=error msg="Failed to destroy network for sandbox \"73902c2b94aa3b739eec1b9c927824d6c7be1ed3a59671d7176e15999515e11f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 23:46:50.623157 containerd[1623]: time="2026-01-15T23:46:50.623114204Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-69897db5bb-bgndc,Uid:fc5a246d-dfc3-43e7-b2c7-409da1aecc92,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"73902c2b94aa3b739eec1b9c927824d6c7be1ed3a59671d7176e15999515e11f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 23:46:50.623579 kubelet[2882]: E0115 23:46:50.623386 2882 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"73902c2b94aa3b739eec1b9c927824d6c7be1ed3a59671d7176e15999515e11f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 23:46:50.623579 kubelet[2882]: E0115 23:46:50.623456 2882 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"73902c2b94aa3b739eec1b9c927824d6c7be1ed3a59671d7176e15999515e11f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-69897db5bb-bgndc" Jan 15 23:46:50.623579 kubelet[2882]: E0115 23:46:50.623476 2882 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"73902c2b94aa3b739eec1b9c927824d6c7be1ed3a59671d7176e15999515e11f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-69897db5bb-bgndc" Jan 15 23:46:50.623710 kubelet[2882]: E0115 23:46:50.623529 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-69897db5bb-bgndc_calico-system(fc5a246d-dfc3-43e7-b2c7-409da1aecc92)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-kube-controllers-69897db5bb-bgndc_calico-system(fc5a246d-dfc3-43e7-b2c7-409da1aecc92)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"73902c2b94aa3b739eec1b9c927824d6c7be1ed3a59671d7176e15999515e11f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-69897db5bb-bgndc" podUID="fc5a246d-dfc3-43e7-b2c7-409da1aecc92" Jan 15 23:46:50.631388 containerd[1623]: time="2026-01-15T23:46:50.631345524Z" level=error msg="Failed to destroy network for sandbox \"250452c1cbd2685cfeac5005851686362359f6afdb444e26179b0b0f9e1986a2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 23:46:50.633028 containerd[1623]: time="2026-01-15T23:46:50.632991572Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-blfkv,Uid:9e61b7bf-93dc-4d27-b649-6160a63ab763,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"250452c1cbd2685cfeac5005851686362359f6afdb444e26179b0b0f9e1986a2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 23:46:50.633450 kubelet[2882]: E0115 23:46:50.633364 2882 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"250452c1cbd2685cfeac5005851686362359f6afdb444e26179b0b0f9e1986a2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 23:46:50.633450 kubelet[2882]: E0115 23:46:50.633418 2882 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"250452c1cbd2685cfeac5005851686362359f6afdb444e26179b0b0f9e1986a2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-blfkv" Jan 15 23:46:50.633588 kubelet[2882]: E0115 23:46:50.633557 2882 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"250452c1cbd2685cfeac5005851686362359f6afdb444e26179b0b0f9e1986a2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-blfkv" Jan 15 23:46:50.633714 kubelet[2882]: E0115 23:46:50.633690 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-blfkv_kube-system(9e61b7bf-93dc-4d27-b649-6160a63ab763)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-blfkv_kube-system(9e61b7bf-93dc-4d27-b649-6160a63ab763)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"250452c1cbd2685cfeac5005851686362359f6afdb444e26179b0b0f9e1986a2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: 
check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-blfkv" podUID="9e61b7bf-93dc-4d27-b649-6160a63ab763" Jan 15 23:46:50.988999 containerd[1623]: time="2026-01-15T23:46:50.988892411Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Jan 15 23:46:55.324375 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount619598515.mount: Deactivated successfully. Jan 15 23:46:55.352472 containerd[1623]: time="2026-01-15T23:46:55.352326011Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 23:46:55.353923 containerd[1623]: time="2026-01-15T23:46:55.353892578Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=150934562" Jan 15 23:46:55.355147 containerd[1623]: time="2026-01-15T23:46:55.355112864Z" level=info msg="ImageCreate event name:\"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 23:46:55.357256 containerd[1623]: time="2026-01-15T23:46:55.357219874Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 23:46:55.358325 containerd[1623]: time="2026-01-15T23:46:55.358285600Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"150934424\" in 4.369333629s" Jan 15 23:46:55.358364 containerd[1623]: time="2026-01-15T23:46:55.358326240Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\"" Jan 15 23:46:55.369716 containerd[1623]: time="2026-01-15T23:46:55.369675735Z" level=info msg="CreateContainer within sandbox \"9f9addd8390571d8bca1238ddfa2f84bd3c7124d584b58425332d771cac02ebf\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 15 23:46:55.382981 containerd[1623]: time="2026-01-15T23:46:55.381781233Z" level=info msg="Container 321b2471cffe7e10d0d65d274620e049295ac014edc47a121fd3596959f27a48: CDI devices from CRI Config.CDIDevices: []" Jan 15 23:46:55.392194 containerd[1623]: time="2026-01-15T23:46:55.392144963Z" level=info msg="CreateContainer within sandbox \"9f9addd8390571d8bca1238ddfa2f84bd3c7124d584b58425332d771cac02ebf\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"321b2471cffe7e10d0d65d274620e049295ac014edc47a121fd3596959f27a48\"" Jan 15 23:46:55.393323 containerd[1623]: time="2026-01-15T23:46:55.393290689Z" level=info msg="StartContainer for \"321b2471cffe7e10d0d65d274620e049295ac014edc47a121fd3596959f27a48\"" Jan 15 23:46:55.395002 containerd[1623]: time="2026-01-15T23:46:55.394972617Z" level=info msg="connecting to shim 321b2471cffe7e10d0d65d274620e049295ac014edc47a121fd3596959f27a48" address="unix:///run/containerd/s/9e5f056a1874f311b505cde997a4579f24e79e7e5a18db941a008cfa12c21bb4" protocol=ttrpc version=3 Jan 15 23:46:55.412678 systemd[1]: Started cri-containerd-321b2471cffe7e10d0d65d274620e049295ac014edc47a121fd3596959f27a48.scope - libcontainer container 
321b2471cffe7e10d0d65d274620e049295ac014edc47a121fd3596959f27a48. Jan 15 23:46:55.493056 containerd[1623]: time="2026-01-15T23:46:55.493009250Z" level=info msg="StartContainer for \"321b2471cffe7e10d0d65d274620e049295ac014edc47a121fd3596959f27a48\" returns successfully" Jan 15 23:46:55.628526 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jan 15 23:46:55.628711 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld <Jason@zx2c4.com>. All Rights Reserved. Jan 15 23:46:55.853323 kubelet[2882]: I0115 23:46:55.852884 2882 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4ln9\" (UniqueName: \"kubernetes.io/projected/bd4ec28f-506a-4bc3-a31b-47884dea56d8-kube-api-access-r4ln9\") pod \"bd4ec28f-506a-4bc3-a31b-47884dea56d8\" (UID: \"bd4ec28f-506a-4bc3-a31b-47884dea56d8\") " Jan 15 23:46:55.853323 kubelet[2882]: I0115 23:46:55.852943 2882 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd4ec28f-506a-4bc3-a31b-47884dea56d8-whisker-ca-bundle\") pod \"bd4ec28f-506a-4bc3-a31b-47884dea56d8\" (UID: \"bd4ec28f-506a-4bc3-a31b-47884dea56d8\") " Jan 15 23:46:55.853323 kubelet[2882]: I0115 23:46:55.852966 2882 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/bd4ec28f-506a-4bc3-a31b-47884dea56d8-whisker-backend-key-pair\") pod \"bd4ec28f-506a-4bc3-a31b-47884dea56d8\" (UID: \"bd4ec28f-506a-4bc3-a31b-47884dea56d8\") " Jan 15 23:46:55.853710 kubelet[2882]: I0115 23:46:55.853331 2882 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd4ec28f-506a-4bc3-a31b-47884dea56d8-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "bd4ec28f-506a-4bc3-a31b-47884dea56d8" (UID: "bd4ec28f-506a-4bc3-a31b-47884dea56d8"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Jan 15 23:46:55.855444 kubelet[2882]: I0115 23:46:55.855387 2882 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd4ec28f-506a-4bc3-a31b-47884dea56d8-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "bd4ec28f-506a-4bc3-a31b-47884dea56d8" (UID: "bd4ec28f-506a-4bc3-a31b-47884dea56d8"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Jan 15 23:46:55.856315 kubelet[2882]: I0115 23:46:55.856277 2882 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd4ec28f-506a-4bc3-a31b-47884dea56d8-kube-api-access-r4ln9" (OuterVolumeSpecName: "kube-api-access-r4ln9") pod "bd4ec28f-506a-4bc3-a31b-47884dea56d8" (UID: "bd4ec28f-506a-4bc3-a31b-47884dea56d8"). InnerVolumeSpecName "kube-api-access-r4ln9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Jan 15 23:46:55.888183 systemd[1]: Removed slice kubepods-besteffort-podbd4ec28f_506a_4bc3_a31b_47884dea56d8.slice - libcontainer container kubepods-besteffort-podbd4ec28f_506a_4bc3_a31b_47884dea56d8.slice. 
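Every failed sandbox in the cascade above died on the same stat: the Calico CNI binary reads /var/lib/calico/nodename to learn which Calico Node resource it belongs to, and that file only exists once the calico/node container, which just started successfully above, is running with /var/lib/calico/ mounted from the host. A hedged sketch of that lookup (the nodename helper is illustrative, reconstructed from the error text rather than copied from Calico's source):

package main

import (
	"fmt"
	"os"
	"strings"
)

// nodename is a hypothetical stand-in for the check the Calico CNI plugin
// performs before any sandbox add/delete: calico/node writes this file at
// startup, so until it runs every CNI call fails exactly as logged above.
func nodename() (string, error) {
	data, err := os.ReadFile("/var/lib/calico/nodename")
	if err != nil {
		return "", fmt.Errorf("reading nodename: %w (is calico/node running and /var/lib/calico/ mounted?)", err)
	}
	return strings.TrimSpace(string(data)), nil
}

func main() {
	name, err := nodename()
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Println("acting as Calico node", name)
}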
Jan 15 23:46:55.953740 kubelet[2882]: I0115 23:46:55.953663 2882 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r4ln9\" (UniqueName: \"kubernetes.io/projected/bd4ec28f-506a-4bc3-a31b-47884dea56d8-kube-api-access-r4ln9\") on node \"ci-4459-2-2-n-b7ec270451\" DevicePath \"\"" Jan 15 23:46:55.953935 kubelet[2882]: I0115 23:46:55.953900 2882 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd4ec28f-506a-4bc3-a31b-47884dea56d8-whisker-ca-bundle\") on node \"ci-4459-2-2-n-b7ec270451\" DevicePath \"\"" Jan 15 23:46:55.953935 kubelet[2882]: I0115 23:46:55.953919 2882 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/bd4ec28f-506a-4bc3-a31b-47884dea56d8-whisker-backend-key-pair\") on node \"ci-4459-2-2-n-b7ec270451\" DevicePath \"\"" Jan 15 23:46:56.043730 kubelet[2882]: I0115 23:46:56.043661 2882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-zzvbk" podStartSLOduration=1.729849748 podStartE2EDuration="19.043647431s" podCreationTimestamp="2026-01-15 23:46:37 +0000 UTC" firstStartedPulling="2026-01-15 23:46:38.04510736 +0000 UTC m=+22.245102705" lastFinishedPulling="2026-01-15 23:46:55.358905043 +0000 UTC m=+39.558900388" observedRunningTime="2026-01-15 23:46:56.024671259 +0000 UTC m=+40.224666644" watchObservedRunningTime="2026-01-15 23:46:56.043647431 +0000 UTC m=+40.243642776" Jan 15 23:46:56.105570 systemd[1]: Created slice kubepods-besteffort-pod7d889576_5f17_49b8_be14_3f0e7bea06cf.slice - libcontainer container kubepods-besteffort-pod7d889576_5f17_49b8_be14_3f0e7bea06cf.slice. Jan 15 23:46:56.155341 kubelet[2882]: I0115 23:46:56.155287 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dslx\" (UniqueName: \"kubernetes.io/projected/7d889576-5f17-49b8-be14-3f0e7bea06cf-kube-api-access-4dslx\") pod \"whisker-6d7d4c4d95-4t7sh\" (UID: \"7d889576-5f17-49b8-be14-3f0e7bea06cf\") " pod="calico-system/whisker-6d7d4c4d95-4t7sh" Jan 15 23:46:56.155568 kubelet[2882]: I0115 23:46:56.155543 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/7d889576-5f17-49b8-be14-3f0e7bea06cf-whisker-backend-key-pair\") pod \"whisker-6d7d4c4d95-4t7sh\" (UID: \"7d889576-5f17-49b8-be14-3f0e7bea06cf\") " pod="calico-system/whisker-6d7d4c4d95-4t7sh" Jan 15 23:46:56.155662 kubelet[2882]: I0115 23:46:56.155650 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d889576-5f17-49b8-be14-3f0e7bea06cf-whisker-ca-bundle\") pod \"whisker-6d7d4c4d95-4t7sh\" (UID: \"7d889576-5f17-49b8-be14-3f0e7bea06cf\") " pod="calico-system/whisker-6d7d4c4d95-4t7sh" Jan 15 23:46:56.327100 systemd[1]: var-lib-kubelet-pods-bd4ec28f\x2d506a\x2d4bc3\x2da31b\x2d47884dea56d8-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dr4ln9.mount: Deactivated successfully. Jan 15 23:46:56.327190 systemd[1]: var-lib-kubelet-pods-bd4ec28f\x2d506a\x2d4bc3\x2da31b\x2d47884dea56d8-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
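The pod_startup_latency_tracker entry above can be re-derived from its own timestamps: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration additionally subtracts the image-pull window (lastFinishedPulling minus firstStartedPulling), which is why a 19s end-to-end start collapses to a 1.73s SLO figure. A quick check in Go with the values copied from that entry (the subtraction rule is inferred from the numbers, which match exactly):

package main

import (
	"fmt"
	"time"
)

func main() {
	// Layout matches Go's default time.Time formatting used in the entry.
	parse := func(s string) time.Time {
		t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
		if err != nil {
			panic(err)
		}
		return t
	}
	created := parse("2026-01-15 23:46:37 +0000 UTC")            // podCreationTimestamp
	firstPull := parse("2026-01-15 23:46:38.04510736 +0000 UTC") // firstStartedPulling
	lastPull := parse("2026-01-15 23:46:55.358905043 +0000 UTC") // lastFinishedPulling
	observed := parse("2026-01-15 23:46:56.043647431 +0000 UTC") // watchObservedRunningTime

	e2e := observed.Sub(created)         // 19.043647431s, the logged E2E duration
	slo := e2e - lastPull.Sub(firstPull) // 1.729849748s, the logged SLO duration
	fmt.Println(e2e, slo)
}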
Jan 15 23:46:56.408513 containerd[1623]: time="2026-01-15T23:46:56.408382033Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6d7d4c4d95-4t7sh,Uid:7d889576-5f17-49b8-be14-3f0e7bea06cf,Namespace:calico-system,Attempt:0,}" Jan 15 23:46:56.553956 systemd-networkd[1522]: calid991399928d: Link UP Jan 15 23:46:56.554512 systemd-networkd[1522]: calid991399928d: Gained carrier Jan 15 23:46:56.570145 containerd[1623]: 2026-01-15 23:46:56.429 [INFO][4037] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 15 23:46:56.570145 containerd[1623]: 2026-01-15 23:46:56.449 [INFO][4037] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--n--b7ec270451-k8s-whisker--6d7d4c4d95--4t7sh-eth0 whisker-6d7d4c4d95- calico-system 7d889576-5f17-49b8-be14-3f0e7bea06cf 893 0 2026-01-15 23:46:56 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:6d7d4c4d95 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459-2-2-n-b7ec270451 whisker-6d7d4c4d95-4t7sh eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calid991399928d [] [] }} ContainerID="2aa56b42eb49b11567384507d8ce659c3c88f007885dd2ba0b55e8785f297796" Namespace="calico-system" Pod="whisker-6d7d4c4d95-4t7sh" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-whisker--6d7d4c4d95--4t7sh-" Jan 15 23:46:56.570145 containerd[1623]: 2026-01-15 23:46:56.449 [INFO][4037] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2aa56b42eb49b11567384507d8ce659c3c88f007885dd2ba0b55e8785f297796" Namespace="calico-system" Pod="whisker-6d7d4c4d95-4t7sh" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-whisker--6d7d4c4d95--4t7sh-eth0" Jan 15 23:46:56.570145 containerd[1623]: 2026-01-15 23:46:56.493 [INFO][4051] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2aa56b42eb49b11567384507d8ce659c3c88f007885dd2ba0b55e8785f297796" HandleID="k8s-pod-network.2aa56b42eb49b11567384507d8ce659c3c88f007885dd2ba0b55e8785f297796" Workload="ci--4459--2--2--n--b7ec270451-k8s-whisker--6d7d4c4d95--4t7sh-eth0" Jan 15 23:46:56.570353 containerd[1623]: 2026-01-15 23:46:56.494 [INFO][4051] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="2aa56b42eb49b11567384507d8ce659c3c88f007885dd2ba0b55e8785f297796" HandleID="k8s-pod-network.2aa56b42eb49b11567384507d8ce659c3c88f007885dd2ba0b55e8785f297796" Workload="ci--4459--2--2--n--b7ec270451-k8s-whisker--6d7d4c4d95--4t7sh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000137760), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-n-b7ec270451", "pod":"whisker-6d7d4c4d95-4t7sh", "timestamp":"2026-01-15 23:46:56.493242243 +0000 UTC"}, Hostname:"ci-4459-2-2-n-b7ec270451", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 15 23:46:56.570353 containerd[1623]: 2026-01-15 23:46:56.494 [INFO][4051] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 15 23:46:56.570353 containerd[1623]: 2026-01-15 23:46:56.494 [INFO][4051] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 15 23:46:56.570353 containerd[1623]: 2026-01-15 23:46:56.494 [INFO][4051] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-n-b7ec270451' Jan 15 23:46:56.570353 containerd[1623]: 2026-01-15 23:46:56.504 [INFO][4051] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.2aa56b42eb49b11567384507d8ce659c3c88f007885dd2ba0b55e8785f297796" host="ci-4459-2-2-n-b7ec270451" Jan 15 23:46:56.570353 containerd[1623]: 2026-01-15 23:46:56.510 [INFO][4051] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-n-b7ec270451" Jan 15 23:46:56.570353 containerd[1623]: 2026-01-15 23:46:56.515 [INFO][4051] ipam/ipam.go 511: Trying affinity for 192.168.85.64/26 host="ci-4459-2-2-n-b7ec270451" Jan 15 23:46:56.570353 containerd[1623]: 2026-01-15 23:46:56.517 [INFO][4051] ipam/ipam.go 158: Attempting to load block cidr=192.168.85.64/26 host="ci-4459-2-2-n-b7ec270451" Jan 15 23:46:56.570353 containerd[1623]: 2026-01-15 23:46:56.520 [INFO][4051] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.85.64/26 host="ci-4459-2-2-n-b7ec270451" Jan 15 23:46:56.570567 containerd[1623]: 2026-01-15 23:46:56.520 [INFO][4051] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.85.64/26 handle="k8s-pod-network.2aa56b42eb49b11567384507d8ce659c3c88f007885dd2ba0b55e8785f297796" host="ci-4459-2-2-n-b7ec270451" Jan 15 23:46:56.570567 containerd[1623]: 2026-01-15 23:46:56.522 [INFO][4051] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.2aa56b42eb49b11567384507d8ce659c3c88f007885dd2ba0b55e8785f297796 Jan 15 23:46:56.570567 containerd[1623]: 2026-01-15 23:46:56.525 [INFO][4051] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.85.64/26 handle="k8s-pod-network.2aa56b42eb49b11567384507d8ce659c3c88f007885dd2ba0b55e8785f297796" host="ci-4459-2-2-n-b7ec270451" Jan 15 23:46:56.570567 containerd[1623]: 2026-01-15 23:46:56.532 [INFO][4051] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.85.65/26] block=192.168.85.64/26 handle="k8s-pod-network.2aa56b42eb49b11567384507d8ce659c3c88f007885dd2ba0b55e8785f297796" host="ci-4459-2-2-n-b7ec270451" Jan 15 23:46:56.570567 containerd[1623]: 2026-01-15 23:46:56.532 [INFO][4051] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.85.65/26] handle="k8s-pod-network.2aa56b42eb49b11567384507d8ce659c3c88f007885dd2ba0b55e8785f297796" host="ci-4459-2-2-n-b7ec270451" Jan 15 23:46:56.570567 containerd[1623]: 2026-01-15 23:46:56.532 [INFO][4051] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
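The [4051] lines walk the block-affinity algorithm step by step: confirm the host's affinity for 192.168.85.64/26, load the block, assign one address from it, create a handle, and write the block back before claiming 192.168.85.65. A toy Go model of the assignment step (names and data layout are illustrative, not Calico's actual API):

    package main

    import (
        "fmt"
        "net/netip"
    )

    // block models one affine IPAM block: a /26 plus a record of which
    // addresses are already claimed, keyed by handle.
    type block struct {
        cidr netip.Prefix
        used map[netip.Addr]string
    }

    // assign hands out the first free address in the block and records
    // the handle that claimed it.
    func (b *block) assign(handle string) (netip.Addr, bool) {
        for a := b.cidr.Addr(); b.cidr.Contains(a); a = a.Next() {
            if _, taken := b.used[a]; !taken {
                b.used[a] = handle
                return a, true
            }
        }
        return netip.Addr{}, false
    }

    func main() {
        b := &block{
            cidr: netip.MustParsePrefix("192.168.85.64/26"),
            used: map[netip.Addr]string{},
        }
        ip, _ := b.assign("k8s-pod-network.2aa56b42eb49b1...")
        fmt.Println(ip)
    }

This toy hands out .64; in the log the first pod receives .65, which suggests .64 was already claimed on the node (commonly the node's own tunnel address, e.g. for vxlan.calico, which gains carrier later in this log).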
Jan 15 23:46:56.570567 containerd[1623]: 2026-01-15 23:46:56.532 [INFO][4051] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.85.65/26] IPv6=[] ContainerID="2aa56b42eb49b11567384507d8ce659c3c88f007885dd2ba0b55e8785f297796" HandleID="k8s-pod-network.2aa56b42eb49b11567384507d8ce659c3c88f007885dd2ba0b55e8785f297796" Workload="ci--4459--2--2--n--b7ec270451-k8s-whisker--6d7d4c4d95--4t7sh-eth0" Jan 15 23:46:56.570739 containerd[1623]: 2026-01-15 23:46:56.534 [INFO][4037] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2aa56b42eb49b11567384507d8ce659c3c88f007885dd2ba0b55e8785f297796" Namespace="calico-system" Pod="whisker-6d7d4c4d95-4t7sh" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-whisker--6d7d4c4d95--4t7sh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--b7ec270451-k8s-whisker--6d7d4c4d95--4t7sh-eth0", GenerateName:"whisker-6d7d4c4d95-", Namespace:"calico-system", SelfLink:"", UID:"7d889576-5f17-49b8-be14-3f0e7bea06cf", ResourceVersion:"893", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 23, 46, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6d7d4c4d95", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-b7ec270451", ContainerID:"", Pod:"whisker-6d7d4c4d95-4t7sh", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.85.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calid991399928d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 23:46:56.570739 containerd[1623]: 2026-01-15 23:46:56.535 [INFO][4037] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.85.65/32] ContainerID="2aa56b42eb49b11567384507d8ce659c3c88f007885dd2ba0b55e8785f297796" Namespace="calico-system" Pod="whisker-6d7d4c4d95-4t7sh" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-whisker--6d7d4c4d95--4t7sh-eth0" Jan 15 23:46:56.570810 containerd[1623]: 2026-01-15 23:46:56.535 [INFO][4037] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid991399928d ContainerID="2aa56b42eb49b11567384507d8ce659c3c88f007885dd2ba0b55e8785f297796" Namespace="calico-system" Pod="whisker-6d7d4c4d95-4t7sh" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-whisker--6d7d4c4d95--4t7sh-eth0" Jan 15 23:46:56.570810 containerd[1623]: 2026-01-15 23:46:56.555 [INFO][4037] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2aa56b42eb49b11567384507d8ce659c3c88f007885dd2ba0b55e8785f297796" Namespace="calico-system" Pod="whisker-6d7d4c4d95-4t7sh" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-whisker--6d7d4c4d95--4t7sh-eth0" Jan 15 23:46:56.570847 containerd[1623]: 2026-01-15 23:46:56.555 [INFO][4037] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="2aa56b42eb49b11567384507d8ce659c3c88f007885dd2ba0b55e8785f297796" 
Namespace="calico-system" Pod="whisker-6d7d4c4d95-4t7sh" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-whisker--6d7d4c4d95--4t7sh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--b7ec270451-k8s-whisker--6d7d4c4d95--4t7sh-eth0", GenerateName:"whisker-6d7d4c4d95-", Namespace:"calico-system", SelfLink:"", UID:"7d889576-5f17-49b8-be14-3f0e7bea06cf", ResourceVersion:"893", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 23, 46, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6d7d4c4d95", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-b7ec270451", ContainerID:"2aa56b42eb49b11567384507d8ce659c3c88f007885dd2ba0b55e8785f297796", Pod:"whisker-6d7d4c4d95-4t7sh", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.85.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calid991399928d", MAC:"aa:a0:07:23:a7:7d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 23:46:56.570894 containerd[1623]: 2026-01-15 23:46:56.567 [INFO][4037] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="2aa56b42eb49b11567384507d8ce659c3c88f007885dd2ba0b55e8785f297796" Namespace="calico-system" Pod="whisker-6d7d4c4d95-4t7sh" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-whisker--6d7d4c4d95--4t7sh-eth0" Jan 15 23:46:56.591743 containerd[1623]: time="2026-01-15T23:46:56.591684478Z" level=info msg="connecting to shim 2aa56b42eb49b11567384507d8ce659c3c88f007885dd2ba0b55e8785f297796" address="unix:///run/containerd/s/d921fb2c62b823bd301c26e9e88cba760ffbcd88f13000047bff24efc07293c8" namespace=k8s.io protocol=ttrpc version=3 Jan 15 23:46:56.610664 systemd[1]: Started cri-containerd-2aa56b42eb49b11567384507d8ce659c3c88f007885dd2ba0b55e8785f297796.scope - libcontainer container 2aa56b42eb49b11567384507d8ce659c3c88f007885dd2ba0b55e8785f297796. 
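The "connecting to shim" line records containerd dialing the sandbox shim's unix socket before the systemd scope starts. A bare-bones equivalent dial (ttrpc framing omitted; the socket path is copied from the log and only resolvable on that host):

    package main

    import (
        "fmt"
        "net"
    )

    func main() {
        // Dial the shim socket the way the log line describes; expect an
        // error anywhere other than the node that produced this log.
        conn, err := net.Dial("unix",
            "/run/containerd/s/d921fb2c62b823bd301c26e9e88cba760ffbcd88f13000047bff24efc07293c8")
        if err != nil {
            fmt.Println("dial:", err)
            return
        }
        defer conn.Close()
        fmt.Println("connected")
    }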
Jan 15 23:46:56.639370 containerd[1623]: time="2026-01-15T23:46:56.639305708Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6d7d4c4d95-4t7sh,Uid:7d889576-5f17-49b8-be14-3f0e7bea06cf,Namespace:calico-system,Attempt:0,} returns sandbox id \"2aa56b42eb49b11567384507d8ce659c3c88f007885dd2ba0b55e8785f297796\"" Jan 15 23:46:56.641117 containerd[1623]: time="2026-01-15T23:46:56.640918316Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 15 23:46:56.965377 containerd[1623]: time="2026-01-15T23:46:56.965332283Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 23:46:56.966675 containerd[1623]: time="2026-01-15T23:46:56.966505369Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 15 23:46:56.966675 containerd[1623]: time="2026-01-15T23:46:56.966570489Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Jan 15 23:46:56.967005 kubelet[2882]: E0115 23:46:56.966907 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 15 23:46:56.967531 kubelet[2882]: E0115 23:46:56.967285 2882 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 15 23:46:56.967607 kubelet[2882]: E0115 23:46:56.967454 2882 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:3d31bd98c0c242218cb5129b8a984197,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4dslx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6d7d4c4d95-4t7sh_calico-system(7d889576-5f17-49b8-be14-3f0e7bea06cf): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 15 23:46:56.969348 containerd[1623]: time="2026-01-15T23:46:56.969313662Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 15 23:46:57.290206 containerd[1623]: time="2026-01-15T23:46:57.290088452Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 23:46:57.291751 containerd[1623]: time="2026-01-15T23:46:57.291701860Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 15 23:46:57.291821 containerd[1623]: time="2026-01-15T23:46:57.291728020Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Jan 15 23:46:57.291997 kubelet[2882]: E0115 23:46:57.291954 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 15 23:46:57.292044 kubelet[2882]: E0115 23:46:57.292009 2882 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not 
found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 15 23:46:57.292161 kubelet[2882]: E0115 23:46:57.292126 2882 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4dslx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6d7d4c4d95-4t7sh_calico-system(7d889576-5f17-49b8-be14-3f0e7bea06cf): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 15 23:46:57.293384 kubelet[2882]: E0115 23:46:57.293328 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6d7d4c4d95-4t7sh" podUID="7d889576-5f17-49b8-be14-3f0e7bea06cf" Jan 15 23:46:57.703806 systemd-networkd[1522]: calid991399928d: Gained IPv6LL Jan 15 23:46:57.882557 kubelet[2882]: I0115 23:46:57.882512 2882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd4ec28f-506a-4bc3-a31b-47884dea56d8" 
path="/var/lib/kubelet/pods/bd4ec28f-506a-4bc3-a31b-47884dea56d8/volumes" Jan 15 23:46:58.008716 kubelet[2882]: E0115 23:46:58.008665 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6d7d4c4d95-4t7sh" podUID="7d889576-5f17-49b8-be14-3f0e7bea06cf" Jan 15 23:47:00.881264 containerd[1623]: time="2026-01-15T23:47:00.881220761Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76998f65d4-hpm9k,Uid:ff12f75f-0f47-4e03-a069-ef3f612b51b0,Namespace:calico-apiserver,Attempt:0,}" Jan 15 23:47:00.990217 systemd-networkd[1522]: cali7101eaa27cc: Link UP Jan 15 23:47:00.990532 systemd-networkd[1522]: cali7101eaa27cc: Gained carrier Jan 15 23:47:01.004681 containerd[1623]: 2026-01-15 23:47:00.906 [INFO][4292] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 15 23:47:01.004681 containerd[1623]: 2026-01-15 23:47:00.921 [INFO][4292] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--n--b7ec270451-k8s-calico--apiserver--76998f65d4--hpm9k-eth0 calico-apiserver-76998f65d4- calico-apiserver ff12f75f-0f47-4e03-a069-ef3f612b51b0 820 0 2026-01-15 23:46:32 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:76998f65d4 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-2-n-b7ec270451 calico-apiserver-76998f65d4-hpm9k eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali7101eaa27cc [] [] }} ContainerID="ec2c7305ecb83112d8616eb18a6f85d63452e82f36ff84a8a06abe3a2f9e16f0" Namespace="calico-apiserver" Pod="calico-apiserver-76998f65d4-hpm9k" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-calico--apiserver--76998f65d4--hpm9k-" Jan 15 23:47:01.004681 containerd[1623]: 2026-01-15 23:47:00.921 [INFO][4292] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ec2c7305ecb83112d8616eb18a6f85d63452e82f36ff84a8a06abe3a2f9e16f0" Namespace="calico-apiserver" Pod="calico-apiserver-76998f65d4-hpm9k" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-calico--apiserver--76998f65d4--hpm9k-eth0" Jan 15 23:47:01.004681 containerd[1623]: 2026-01-15 23:47:00.943 [INFO][4307] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ec2c7305ecb83112d8616eb18a6f85d63452e82f36ff84a8a06abe3a2f9e16f0" HandleID="k8s-pod-network.ec2c7305ecb83112d8616eb18a6f85d63452e82f36ff84a8a06abe3a2f9e16f0" Workload="ci--4459--2--2--n--b7ec270451-k8s-calico--apiserver--76998f65d4--hpm9k-eth0" Jan 15 23:47:01.004927 containerd[1623]: 
2026-01-15 23:47:00.944 [INFO][4307] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="ec2c7305ecb83112d8616eb18a6f85d63452e82f36ff84a8a06abe3a2f9e16f0" HandleID="k8s-pod-network.ec2c7305ecb83112d8616eb18a6f85d63452e82f36ff84a8a06abe3a2f9e16f0" Workload="ci--4459--2--2--n--b7ec270451-k8s-calico--apiserver--76998f65d4--hpm9k-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004db00), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459-2-2-n-b7ec270451", "pod":"calico-apiserver-76998f65d4-hpm9k", "timestamp":"2026-01-15 23:47:00.943906344 +0000 UTC"}, Hostname:"ci-4459-2-2-n-b7ec270451", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 15 23:47:01.004927 containerd[1623]: 2026-01-15 23:47:00.944 [INFO][4307] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 15 23:47:01.004927 containerd[1623]: 2026-01-15 23:47:00.944 [INFO][4307] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 15 23:47:01.004927 containerd[1623]: 2026-01-15 23:47:00.944 [INFO][4307] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-n-b7ec270451' Jan 15 23:47:01.004927 containerd[1623]: 2026-01-15 23:47:00.955 [INFO][4307] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ec2c7305ecb83112d8616eb18a6f85d63452e82f36ff84a8a06abe3a2f9e16f0" host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:01.004927 containerd[1623]: 2026-01-15 23:47:00.960 [INFO][4307] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:01.004927 containerd[1623]: 2026-01-15 23:47:00.965 [INFO][4307] ipam/ipam.go 511: Trying affinity for 192.168.85.64/26 host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:01.004927 containerd[1623]: 2026-01-15 23:47:00.967 [INFO][4307] ipam/ipam.go 158: Attempting to load block cidr=192.168.85.64/26 host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:01.004927 containerd[1623]: 2026-01-15 23:47:00.970 [INFO][4307] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.85.64/26 host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:01.005115 containerd[1623]: 2026-01-15 23:47:00.970 [INFO][4307] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.85.64/26 handle="k8s-pod-network.ec2c7305ecb83112d8616eb18a6f85d63452e82f36ff84a8a06abe3a2f9e16f0" host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:01.005115 containerd[1623]: 2026-01-15 23:47:00.971 [INFO][4307] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.ec2c7305ecb83112d8616eb18a6f85d63452e82f36ff84a8a06abe3a2f9e16f0 Jan 15 23:47:01.005115 containerd[1623]: 2026-01-15 23:47:00.980 [INFO][4307] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.85.64/26 handle="k8s-pod-network.ec2c7305ecb83112d8616eb18a6f85d63452e82f36ff84a8a06abe3a2f9e16f0" host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:01.005115 containerd[1623]: 2026-01-15 23:47:00.985 [INFO][4307] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.85.66/26] block=192.168.85.64/26 handle="k8s-pod-network.ec2c7305ecb83112d8616eb18a6f85d63452e82f36ff84a8a06abe3a2f9e16f0" host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:01.005115 containerd[1623]: 2026-01-15 23:47:00.985 [INFO][4307] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.85.66/26] handle="k8s-pod-network.ec2c7305ecb83112d8616eb18a6f85d63452e82f36ff84a8a06abe3a2f9e16f0" 
host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:01.005115 containerd[1623]: 2026-01-15 23:47:00.986 [INFO][4307] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 15 23:47:01.005115 containerd[1623]: 2026-01-15 23:47:00.986 [INFO][4307] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.85.66/26] IPv6=[] ContainerID="ec2c7305ecb83112d8616eb18a6f85d63452e82f36ff84a8a06abe3a2f9e16f0" HandleID="k8s-pod-network.ec2c7305ecb83112d8616eb18a6f85d63452e82f36ff84a8a06abe3a2f9e16f0" Workload="ci--4459--2--2--n--b7ec270451-k8s-calico--apiserver--76998f65d4--hpm9k-eth0" Jan 15 23:47:01.005244 containerd[1623]: 2026-01-15 23:47:00.987 [INFO][4292] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ec2c7305ecb83112d8616eb18a6f85d63452e82f36ff84a8a06abe3a2f9e16f0" Namespace="calico-apiserver" Pod="calico-apiserver-76998f65d4-hpm9k" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-calico--apiserver--76998f65d4--hpm9k-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--b7ec270451-k8s-calico--apiserver--76998f65d4--hpm9k-eth0", GenerateName:"calico-apiserver-76998f65d4-", Namespace:"calico-apiserver", SelfLink:"", UID:"ff12f75f-0f47-4e03-a069-ef3f612b51b0", ResourceVersion:"820", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 23, 46, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"76998f65d4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-b7ec270451", ContainerID:"", Pod:"calico-apiserver-76998f65d4-hpm9k", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.85.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali7101eaa27cc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 23:47:01.005294 containerd[1623]: 2026-01-15 23:47:00.988 [INFO][4292] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.85.66/32] ContainerID="ec2c7305ecb83112d8616eb18a6f85d63452e82f36ff84a8a06abe3a2f9e16f0" Namespace="calico-apiserver" Pod="calico-apiserver-76998f65d4-hpm9k" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-calico--apiserver--76998f65d4--hpm9k-eth0" Jan 15 23:47:01.005294 containerd[1623]: 2026-01-15 23:47:00.988 [INFO][4292] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7101eaa27cc ContainerID="ec2c7305ecb83112d8616eb18a6f85d63452e82f36ff84a8a06abe3a2f9e16f0" Namespace="calico-apiserver" Pod="calico-apiserver-76998f65d4-hpm9k" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-calico--apiserver--76998f65d4--hpm9k-eth0" Jan 15 23:47:01.005294 containerd[1623]: 2026-01-15 23:47:00.990 [INFO][4292] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ec2c7305ecb83112d8616eb18a6f85d63452e82f36ff84a8a06abe3a2f9e16f0" Namespace="calico-apiserver" 
Pod="calico-apiserver-76998f65d4-hpm9k" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-calico--apiserver--76998f65d4--hpm9k-eth0" Jan 15 23:47:01.005352 containerd[1623]: 2026-01-15 23:47:00.991 [INFO][4292] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ec2c7305ecb83112d8616eb18a6f85d63452e82f36ff84a8a06abe3a2f9e16f0" Namespace="calico-apiserver" Pod="calico-apiserver-76998f65d4-hpm9k" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-calico--apiserver--76998f65d4--hpm9k-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--b7ec270451-k8s-calico--apiserver--76998f65d4--hpm9k-eth0", GenerateName:"calico-apiserver-76998f65d4-", Namespace:"calico-apiserver", SelfLink:"", UID:"ff12f75f-0f47-4e03-a069-ef3f612b51b0", ResourceVersion:"820", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 23, 46, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"76998f65d4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-b7ec270451", ContainerID:"ec2c7305ecb83112d8616eb18a6f85d63452e82f36ff84a8a06abe3a2f9e16f0", Pod:"calico-apiserver-76998f65d4-hpm9k", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.85.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali7101eaa27cc", MAC:"3a:e4:a4:a8:57:91", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 23:47:01.005399 containerd[1623]: 2026-01-15 23:47:01.003 [INFO][4292] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ec2c7305ecb83112d8616eb18a6f85d63452e82f36ff84a8a06abe3a2f9e16f0" Namespace="calico-apiserver" Pod="calico-apiserver-76998f65d4-hpm9k" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-calico--apiserver--76998f65d4--hpm9k-eth0" Jan 15 23:47:01.027205 containerd[1623]: time="2026-01-15T23:47:01.027117466Z" level=info msg="connecting to shim ec2c7305ecb83112d8616eb18a6f85d63452e82f36ff84a8a06abe3a2f9e16f0" address="unix:///run/containerd/s/346dc0439bf344ef278aa2a8965a2456b764e7ac2181d3fd12c981c9ba76632b" namespace=k8s.io protocol=ttrpc version=3 Jan 15 23:47:01.058649 systemd[1]: Started cri-containerd-ec2c7305ecb83112d8616eb18a6f85d63452e82f36ff84a8a06abe3a2f9e16f0.scope - libcontainer container ec2c7305ecb83112d8616eb18a6f85d63452e82f36ff84a8a06abe3a2f9e16f0. 
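As with the whisker endpoint, the [4292] lines show a two-phase write: the endpoint is first populated, then the MAC and container ID are added, then "Wrote updated endpoint to datastore". The objects carry a ResourceVersion ("820"), which suggests the write is guarded by optimistic concurrency; a toy in-process model of that compare-and-swap pattern (illustrative, not the actual datastore client):

    package main

    import (
        "errors"
        "fmt"
        "sync"
    )

    type endpoint struct {
        Version     int
        MAC         string
        ContainerID string
    }

    type store struct {
        mu sync.Mutex
        ep endpoint
    }

    var errConflict = errors.New("resource version conflict")

    // update applies mutate only if the caller read the latest version,
    // mimicking a ResourceVersion-guarded write-back.
    func (s *store) update(seen int, mutate func(*endpoint)) error {
        s.mu.Lock()
        defer s.mu.Unlock()
        if s.ep.Version != seen {
            return errConflict
        }
        mutate(&s.ep)
        s.ep.Version++
        return nil
    }

    func main() {
        s := &store{ep: endpoint{Version: 820}}
        err := s.update(820, func(e *endpoint) {
            e.MAC = "3a:e4:a4:a8:57:91"
            e.ContainerID = "ec2c7305ecb83112..."
        })
        fmt.Println(err, s.ep)
    }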
Jan 15 23:47:01.090572 containerd[1623]: time="2026-01-15T23:47:01.090537932Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76998f65d4-hpm9k,Uid:ff12f75f-0f47-4e03-a069-ef3f612b51b0,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"ec2c7305ecb83112d8616eb18a6f85d63452e82f36ff84a8a06abe3a2f9e16f0\"" Jan 15 23:47:01.091987 containerd[1623]: time="2026-01-15T23:47:01.091846538Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 15 23:47:01.421021 containerd[1623]: time="2026-01-15T23:47:01.420961208Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 23:47:01.422465 containerd[1623]: time="2026-01-15T23:47:01.422409015Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 15 23:47:01.422506 containerd[1623]: time="2026-01-15T23:47:01.422455656Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Jan 15 23:47:01.422849 kubelet[2882]: E0115 23:47:01.422780 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 23:47:01.423578 kubelet[2882]: E0115 23:47:01.423311 2882 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 23:47:01.423578 kubelet[2882]: E0115 23:47:01.423526 2882 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x4zpx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-76998f65d4-hpm9k_calico-apiserver(ff12f75f-0f47-4e03-a069-ef3f612b51b0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 15 23:47:01.424718 kubelet[2882]: E0115 23:47:01.424682 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76998f65d4-hpm9k" podUID="ff12f75f-0f47-4e03-a069-ef3f612b51b0" Jan 15 23:47:01.880919 containerd[1623]: time="2026-01-15T23:47:01.880872470Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-blfkv,Uid:9e61b7bf-93dc-4d27-b649-6160a63ab763,Namespace:kube-system,Attempt:0,}" Jan 15 23:47:01.983379 systemd-networkd[1522]: calic497e1ebb01: Link UP Jan 15 23:47:01.984049 systemd-networkd[1522]: calic497e1ebb01: Gained carrier Jan 15 23:47:01.996497 containerd[1623]: 2026-01-15 23:47:01.903 [INFO][4389] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 15 23:47:01.996497 containerd[1623]: 2026-01-15 23:47:01.917 [INFO][4389] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--n--b7ec270451-k8s-coredns--674b8bbfcf--blfkv-eth0 coredns-674b8bbfcf- kube-system 9e61b7bf-93dc-4d27-b649-6160a63ab763 822 0 2026-01-15 23:46:22 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-2-n-b7ec270451 coredns-674b8bbfcf-blfkv eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calic497e1ebb01 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="67a4b97e4da0f7597b98ade196be2e4b3b436a3f5a0fbe86592e35d6cbeeefc8" Namespace="kube-system" Pod="coredns-674b8bbfcf-blfkv" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-coredns--674b8bbfcf--blfkv-" Jan 15 23:47:01.996497 containerd[1623]: 2026-01-15 23:47:01.917 [INFO][4389] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="67a4b97e4da0f7597b98ade196be2e4b3b436a3f5a0fbe86592e35d6cbeeefc8" Namespace="kube-system" Pod="coredns-674b8bbfcf-blfkv" 
WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-coredns--674b8bbfcf--blfkv-eth0" Jan 15 23:47:01.996497 containerd[1623]: 2026-01-15 23:47:01.939 [INFO][4405] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="67a4b97e4da0f7597b98ade196be2e4b3b436a3f5a0fbe86592e35d6cbeeefc8" HandleID="k8s-pod-network.67a4b97e4da0f7597b98ade196be2e4b3b436a3f5a0fbe86592e35d6cbeeefc8" Workload="ci--4459--2--2--n--b7ec270451-k8s-coredns--674b8bbfcf--blfkv-eth0" Jan 15 23:47:01.996923 containerd[1623]: 2026-01-15 23:47:01.939 [INFO][4405] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="67a4b97e4da0f7597b98ade196be2e4b3b436a3f5a0fbe86592e35d6cbeeefc8" HandleID="k8s-pod-network.67a4b97e4da0f7597b98ade196be2e4b3b436a3f5a0fbe86592e35d6cbeeefc8" Workload="ci--4459--2--2--n--b7ec270451-k8s-coredns--674b8bbfcf--blfkv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000136da0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-2-n-b7ec270451", "pod":"coredns-674b8bbfcf-blfkv", "timestamp":"2026-01-15 23:47:01.939412313 +0000 UTC"}, Hostname:"ci-4459-2-2-n-b7ec270451", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 15 23:47:01.996923 containerd[1623]: 2026-01-15 23:47:01.939 [INFO][4405] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 15 23:47:01.996923 containerd[1623]: 2026-01-15 23:47:01.939 [INFO][4405] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 15 23:47:01.996923 containerd[1623]: 2026-01-15 23:47:01.939 [INFO][4405] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-n-b7ec270451' Jan 15 23:47:01.996923 containerd[1623]: 2026-01-15 23:47:01.949 [INFO][4405] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.67a4b97e4da0f7597b98ade196be2e4b3b436a3f5a0fbe86592e35d6cbeeefc8" host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:01.996923 containerd[1623]: 2026-01-15 23:47:01.955 [INFO][4405] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:01.996923 containerd[1623]: 2026-01-15 23:47:01.960 [INFO][4405] ipam/ipam.go 511: Trying affinity for 192.168.85.64/26 host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:01.996923 containerd[1623]: 2026-01-15 23:47:01.962 [INFO][4405] ipam/ipam.go 158: Attempting to load block cidr=192.168.85.64/26 host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:01.996923 containerd[1623]: 2026-01-15 23:47:01.965 [INFO][4405] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.85.64/26 host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:01.997138 containerd[1623]: 2026-01-15 23:47:01.965 [INFO][4405] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.85.64/26 handle="k8s-pod-network.67a4b97e4da0f7597b98ade196be2e4b3b436a3f5a0fbe86592e35d6cbeeefc8" host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:01.997138 containerd[1623]: 2026-01-15 23:47:01.967 [INFO][4405] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.67a4b97e4da0f7597b98ade196be2e4b3b436a3f5a0fbe86592e35d6cbeeefc8 Jan 15 23:47:01.997138 containerd[1623]: 2026-01-15 23:47:01.970 [INFO][4405] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.85.64/26 handle="k8s-pod-network.67a4b97e4da0f7597b98ade196be2e4b3b436a3f5a0fbe86592e35d6cbeeefc8" host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:01.997138 containerd[1623]: 
2026-01-15 23:47:01.979 [INFO][4405] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.85.67/26] block=192.168.85.64/26 handle="k8s-pod-network.67a4b97e4da0f7597b98ade196be2e4b3b436a3f5a0fbe86592e35d6cbeeefc8" host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:01.997138 containerd[1623]: 2026-01-15 23:47:01.979 [INFO][4405] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.85.67/26] handle="k8s-pod-network.67a4b97e4da0f7597b98ade196be2e4b3b436a3f5a0fbe86592e35d6cbeeefc8" host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:01.997138 containerd[1623]: 2026-01-15 23:47:01.979 [INFO][4405] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 15 23:47:01.997138 containerd[1623]: 2026-01-15 23:47:01.979 [INFO][4405] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.85.67/26] IPv6=[] ContainerID="67a4b97e4da0f7597b98ade196be2e4b3b436a3f5a0fbe86592e35d6cbeeefc8" HandleID="k8s-pod-network.67a4b97e4da0f7597b98ade196be2e4b3b436a3f5a0fbe86592e35d6cbeeefc8" Workload="ci--4459--2--2--n--b7ec270451-k8s-coredns--674b8bbfcf--blfkv-eth0" Jan 15 23:47:01.997278 containerd[1623]: 2026-01-15 23:47:01.981 [INFO][4389] cni-plugin/k8s.go 418: Populated endpoint ContainerID="67a4b97e4da0f7597b98ade196be2e4b3b436a3f5a0fbe86592e35d6cbeeefc8" Namespace="kube-system" Pod="coredns-674b8bbfcf-blfkv" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-coredns--674b8bbfcf--blfkv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--b7ec270451-k8s-coredns--674b8bbfcf--blfkv-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"9e61b7bf-93dc-4d27-b649-6160a63ab763", ResourceVersion:"822", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 23, 46, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-b7ec270451", ContainerID:"", Pod:"coredns-674b8bbfcf-blfkv", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.85.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calic497e1ebb01", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 23:47:01.997278 containerd[1623]: 2026-01-15 23:47:01.981 [INFO][4389] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.85.67/32] ContainerID="67a4b97e4da0f7597b98ade196be2e4b3b436a3f5a0fbe86592e35d6cbeeefc8" Namespace="kube-system" Pod="coredns-674b8bbfcf-blfkv" 
WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-coredns--674b8bbfcf--blfkv-eth0" Jan 15 23:47:01.997278 containerd[1623]: 2026-01-15 23:47:01.981 [INFO][4389] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic497e1ebb01 ContainerID="67a4b97e4da0f7597b98ade196be2e4b3b436a3f5a0fbe86592e35d6cbeeefc8" Namespace="kube-system" Pod="coredns-674b8bbfcf-blfkv" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-coredns--674b8bbfcf--blfkv-eth0" Jan 15 23:47:01.997278 containerd[1623]: 2026-01-15 23:47:01.984 [INFO][4389] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="67a4b97e4da0f7597b98ade196be2e4b3b436a3f5a0fbe86592e35d6cbeeefc8" Namespace="kube-system" Pod="coredns-674b8bbfcf-blfkv" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-coredns--674b8bbfcf--blfkv-eth0" Jan 15 23:47:01.997278 containerd[1623]: 2026-01-15 23:47:01.984 [INFO][4389] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="67a4b97e4da0f7597b98ade196be2e4b3b436a3f5a0fbe86592e35d6cbeeefc8" Namespace="kube-system" Pod="coredns-674b8bbfcf-blfkv" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-coredns--674b8bbfcf--blfkv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--b7ec270451-k8s-coredns--674b8bbfcf--blfkv-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"9e61b7bf-93dc-4d27-b649-6160a63ab763", ResourceVersion:"822", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 23, 46, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-b7ec270451", ContainerID:"67a4b97e4da0f7597b98ade196be2e4b3b436a3f5a0fbe86592e35d6cbeeefc8", Pod:"coredns-674b8bbfcf-blfkv", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.85.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calic497e1ebb01", MAC:"46:d5:23:c0:1c:bf", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 23:47:01.997278 containerd[1623]: 2026-01-15 23:47:01.994 [INFO][4389] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="67a4b97e4da0f7597b98ade196be2e4b3b436a3f5a0fbe86592e35d6cbeeefc8" Namespace="kube-system" Pod="coredns-674b8bbfcf-blfkv" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-coredns--674b8bbfcf--blfkv-eth0" Jan 15 23:47:02.021093 kubelet[2882]: E0115 23:47:02.021049 2882 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76998f65d4-hpm9k" podUID="ff12f75f-0f47-4e03-a069-ef3f612b51b0" Jan 15 23:47:02.024221 containerd[1623]: time="2026-01-15T23:47:02.024177643Z" level=info msg="connecting to shim 67a4b97e4da0f7597b98ade196be2e4b3b436a3f5a0fbe86592e35d6cbeeefc8" address="unix:///run/containerd/s/442b3ac4adb24580579115de899d0783b69f21c3fb6dcd03ee8020c4b15b90d1" namespace=k8s.io protocol=ttrpc version=3 Jan 15 23:47:02.044648 systemd[1]: Started cri-containerd-67a4b97e4da0f7597b98ade196be2e4b3b436a3f5a0fbe86592e35d6cbeeefc8.scope - libcontainer container 67a4b97e4da0f7597b98ade196be2e4b3b436a3f5a0fbe86592e35d6cbeeefc8. Jan 15 23:47:02.062371 kubelet[2882]: I0115 23:47:02.062130 2882 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 15 23:47:02.094084 containerd[1623]: time="2026-01-15T23:47:02.094025620Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-blfkv,Uid:9e61b7bf-93dc-4d27-b649-6160a63ab763,Namespace:kube-system,Attempt:0,} returns sandbox id \"67a4b97e4da0f7597b98ade196be2e4b3b436a3f5a0fbe86592e35d6cbeeefc8\"" Jan 15 23:47:02.099973 containerd[1623]: time="2026-01-15T23:47:02.099939409Z" level=info msg="CreateContainer within sandbox \"67a4b97e4da0f7597b98ade196be2e4b3b436a3f5a0fbe86592e35d6cbeeefc8\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 15 23:47:02.111798 containerd[1623]: time="2026-01-15T23:47:02.111754746Z" level=info msg="Container c9018931bb034230afba942ea8008798cbbc407e036c362a4484ad4567208506: CDI devices from CRI Config.CDIDevices: []" Jan 15 23:47:02.114338 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1800380907.mount: Deactivated successfully. Jan 15 23:47:02.118579 containerd[1623]: time="2026-01-15T23:47:02.118544018Z" level=info msg="CreateContainer within sandbox \"67a4b97e4da0f7597b98ade196be2e4b3b436a3f5a0fbe86592e35d6cbeeefc8\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"c9018931bb034230afba942ea8008798cbbc407e036c362a4484ad4567208506\"" Jan 15 23:47:02.119378 containerd[1623]: time="2026-01-15T23:47:02.119342102Z" level=info msg="StartContainer for \"c9018931bb034230afba942ea8008798cbbc407e036c362a4484ad4567208506\"" Jan 15 23:47:02.120289 containerd[1623]: time="2026-01-15T23:47:02.120245187Z" level=info msg="connecting to shim c9018931bb034230afba942ea8008798cbbc407e036c362a4484ad4567208506" address="unix:///run/containerd/s/442b3ac4adb24580579115de899d0783b69f21c3fb6dcd03ee8020c4b15b90d1" protocol=ttrpc version=3 Jan 15 23:47:02.140660 systemd[1]: Started cri-containerd-c9018931bb034230afba942ea8008798cbbc407e036c362a4484ad4567208506.scope - libcontainer container c9018931bb034230afba942ea8008798cbbc407e036c362a4484ad4567208506. 
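The ImagePullBackOff records in this log come from kubelet retrying the failed ghcr.io pulls with an exponentially growing delay. Kubelet's image-pull back-off is commonly described as a 10s base doubling to a 5m cap; treat the constants in this sketch as illustrative rather than authoritative:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        delay := 10 * time.Second
        const maxDelay = 5 * time.Minute
        for attempt := 1; attempt <= 6; attempt++ {
            // Each failed pull extends the wait before the next attempt,
            // which is what the repeated "Back-off pulling image" lines show.
            fmt.Printf("attempt %d: back off %s before retrying pull\n", attempt, delay)
            delay *= 2
            if delay > maxDelay {
                delay = maxDelay
            }
        }
    }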
Jan 15 23:47:02.166495 containerd[1623]: time="2026-01-15T23:47:02.166454290Z" level=info msg="StartContainer for \"c9018931bb034230afba942ea8008798cbbc407e036c362a4484ad4567208506\" returns successfully" Jan 15 23:47:02.505219 systemd-networkd[1522]: vxlan.calico: Link UP Jan 15 23:47:02.505260 systemd-networkd[1522]: vxlan.calico: Gained carrier Jan 15 23:47:02.759801 systemd-networkd[1522]: cali7101eaa27cc: Gained IPv6LL Jan 15 23:47:02.881928 containerd[1623]: time="2026-01-15T23:47:02.881877746Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zqjh8,Uid:fee8d1af-3972-419e-8500-84b3b6b46b71,Namespace:calico-system,Attempt:0,}" Jan 15 23:47:02.986497 systemd-networkd[1522]: calica8538e3c88: Link UP Jan 15 23:47:02.987222 systemd-networkd[1522]: calica8538e3c88: Gained carrier Jan 15 23:47:03.006939 containerd[1623]: 2026-01-15 23:47:02.919 [INFO][4640] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--n--b7ec270451-k8s-csi--node--driver--zqjh8-eth0 csi-node-driver- calico-system fee8d1af-3972-419e-8500-84b3b6b46b71 707 0 2026-01-15 23:46:37 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:857b56db8f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4459-2-2-n-b7ec270451 csi-node-driver-zqjh8 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calica8538e3c88 [] [] }} ContainerID="4216eeaf3f3470d6475ed0f74112c06e243d48e5a168748766165a855c182c8b" Namespace="calico-system" Pod="csi-node-driver-zqjh8" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-csi--node--driver--zqjh8-" Jan 15 23:47:03.006939 containerd[1623]: 2026-01-15 23:47:02.919 [INFO][4640] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4216eeaf3f3470d6475ed0f74112c06e243d48e5a168748766165a855c182c8b" Namespace="calico-system" Pod="csi-node-driver-zqjh8" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-csi--node--driver--zqjh8-eth0" Jan 15 23:47:03.006939 containerd[1623]: 2026-01-15 23:47:02.942 [INFO][4655] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4216eeaf3f3470d6475ed0f74112c06e243d48e5a168748766165a855c182c8b" HandleID="k8s-pod-network.4216eeaf3f3470d6475ed0f74112c06e243d48e5a168748766165a855c182c8b" Workload="ci--4459--2--2--n--b7ec270451-k8s-csi--node--driver--zqjh8-eth0" Jan 15 23:47:03.006939 containerd[1623]: 2026-01-15 23:47:02.943 [INFO][4655] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="4216eeaf3f3470d6475ed0f74112c06e243d48e5a168748766165a855c182c8b" HandleID="k8s-pod-network.4216eeaf3f3470d6475ed0f74112c06e243d48e5a168748766165a855c182c8b" Workload="ci--4459--2--2--n--b7ec270451-k8s-csi--node--driver--zqjh8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40001376c0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-n-b7ec270451", "pod":"csi-node-driver-zqjh8", "timestamp":"2026-01-15 23:47:02.942890321 +0000 UTC"}, Hostname:"ci-4459-2-2-n-b7ec270451", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 15 23:47:03.006939 containerd[1623]: 2026-01-15 23:47:02.943 [INFO][4655] ipam/ipam_plugin.go 377: About to 
acquire host-wide IPAM lock. Jan 15 23:47:03.006939 containerd[1623]: 2026-01-15 23:47:02.943 [INFO][4655] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 15 23:47:03.006939 containerd[1623]: 2026-01-15 23:47:02.943 [INFO][4655] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-n-b7ec270451' Jan 15 23:47:03.006939 containerd[1623]: 2026-01-15 23:47:02.953 [INFO][4655] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.4216eeaf3f3470d6475ed0f74112c06e243d48e5a168748766165a855c182c8b" host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:03.006939 containerd[1623]: 2026-01-15 23:47:02.958 [INFO][4655] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:03.006939 containerd[1623]: 2026-01-15 23:47:02.962 [INFO][4655] ipam/ipam.go 511: Trying affinity for 192.168.85.64/26 host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:03.006939 containerd[1623]: 2026-01-15 23:47:02.965 [INFO][4655] ipam/ipam.go 158: Attempting to load block cidr=192.168.85.64/26 host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:03.006939 containerd[1623]: 2026-01-15 23:47:02.968 [INFO][4655] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.85.64/26 host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:03.006939 containerd[1623]: 2026-01-15 23:47:02.968 [INFO][4655] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.85.64/26 handle="k8s-pod-network.4216eeaf3f3470d6475ed0f74112c06e243d48e5a168748766165a855c182c8b" host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:03.006939 containerd[1623]: 2026-01-15 23:47:02.970 [INFO][4655] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.4216eeaf3f3470d6475ed0f74112c06e243d48e5a168748766165a855c182c8b Jan 15 23:47:03.006939 containerd[1623]: 2026-01-15 23:47:02.975 [INFO][4655] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.85.64/26 handle="k8s-pod-network.4216eeaf3f3470d6475ed0f74112c06e243d48e5a168748766165a855c182c8b" host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:03.006939 containerd[1623]: 2026-01-15 23:47:02.982 [INFO][4655] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.85.68/26] block=192.168.85.64/26 handle="k8s-pod-network.4216eeaf3f3470d6475ed0f74112c06e243d48e5a168748766165a855c182c8b" host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:03.006939 containerd[1623]: 2026-01-15 23:47:02.982 [INFO][4655] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.85.68/26] handle="k8s-pod-network.4216eeaf3f3470d6475ed0f74112c06e243d48e5a168748766165a855c182c8b" host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:03.006939 containerd[1623]: 2026-01-15 23:47:02.982 [INFO][4655] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
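Every sandbox in this log ([4051], [4307], [4405], and now [4655]) repeats the same choreography: acquire the host-wide IPAM lock, assign the next address from the affine /26, release the lock — which is how the node hands out .65, .66, .67, .68 in sequence without collisions. A minimal in-process model of that serialization (the real lock is cross-process; a mutex stands in for it here):

    package main

    import (
        "fmt"
        "sync"
    )

    func main() {
        var (
            ipamLock sync.Mutex
            next     = 65 // .65 was the first address handed out above
            wg       sync.WaitGroup
        )
        for _, pod := range []string{"whisker", "apiserver", "coredns", "csi-node-driver"} {
            wg.Add(1)
            go func(pod string) {
                defer wg.Done()
                ipamLock.Lock() // "Acquired host-wide IPAM lock."
                ip := fmt.Sprintf("192.168.85.%d/26", next)
                next++
                ipamLock.Unlock() // "Released host-wide IPAM lock."
                fmt.Println(pod, ip)
            }(pod)
        }
        wg.Wait() // output order varies, but no address is handed out twice
    }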
Jan 15 23:47:03.006939 containerd[1623]: 2026-01-15 23:47:02.982 [INFO][4655] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.85.68/26] IPv6=[] ContainerID="4216eeaf3f3470d6475ed0f74112c06e243d48e5a168748766165a855c182c8b" HandleID="k8s-pod-network.4216eeaf3f3470d6475ed0f74112c06e243d48e5a168748766165a855c182c8b" Workload="ci--4459--2--2--n--b7ec270451-k8s-csi--node--driver--zqjh8-eth0" Jan 15 23:47:03.007831 containerd[1623]: 2026-01-15 23:47:02.984 [INFO][4640] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4216eeaf3f3470d6475ed0f74112c06e243d48e5a168748766165a855c182c8b" Namespace="calico-system" Pod="csi-node-driver-zqjh8" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-csi--node--driver--zqjh8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--b7ec270451-k8s-csi--node--driver--zqjh8-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"fee8d1af-3972-419e-8500-84b3b6b46b71", ResourceVersion:"707", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 23, 46, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-b7ec270451", ContainerID:"", Pod:"csi-node-driver-zqjh8", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.85.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calica8538e3c88", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 23:47:03.007831 containerd[1623]: 2026-01-15 23:47:02.984 [INFO][4640] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.85.68/32] ContainerID="4216eeaf3f3470d6475ed0f74112c06e243d48e5a168748766165a855c182c8b" Namespace="calico-system" Pod="csi-node-driver-zqjh8" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-csi--node--driver--zqjh8-eth0" Jan 15 23:47:03.007831 containerd[1623]: 2026-01-15 23:47:02.984 [INFO][4640] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calica8538e3c88 ContainerID="4216eeaf3f3470d6475ed0f74112c06e243d48e5a168748766165a855c182c8b" Namespace="calico-system" Pod="csi-node-driver-zqjh8" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-csi--node--driver--zqjh8-eth0" Jan 15 23:47:03.007831 containerd[1623]: 2026-01-15 23:47:02.987 [INFO][4640] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4216eeaf3f3470d6475ed0f74112c06e243d48e5a168748766165a855c182c8b" Namespace="calico-system" Pod="csi-node-driver-zqjh8" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-csi--node--driver--zqjh8-eth0" Jan 15 23:47:03.007831 containerd[1623]: 2026-01-15 23:47:02.988 [INFO][4640] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="4216eeaf3f3470d6475ed0f74112c06e243d48e5a168748766165a855c182c8b" Namespace="calico-system" Pod="csi-node-driver-zqjh8" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-csi--node--driver--zqjh8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--b7ec270451-k8s-csi--node--driver--zqjh8-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"fee8d1af-3972-419e-8500-84b3b6b46b71", ResourceVersion:"707", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 23, 46, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-b7ec270451", ContainerID:"4216eeaf3f3470d6475ed0f74112c06e243d48e5a168748766165a855c182c8b", Pod:"csi-node-driver-zqjh8", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.85.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calica8538e3c88", MAC:"c6:2d:21:95:c4:1d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 23:47:03.007831 containerd[1623]: 2026-01-15 23:47:03.004 [INFO][4640] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4216eeaf3f3470d6475ed0f74112c06e243d48e5a168748766165a855c182c8b" Namespace="calico-system" Pod="csi-node-driver-zqjh8" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-csi--node--driver--zqjh8-eth0" Jan 15 23:47:03.025523 kubelet[2882]: E0115 23:47:03.025327 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76998f65d4-hpm9k" podUID="ff12f75f-0f47-4e03-a069-ef3f612b51b0" Jan 15 23:47:03.038992 containerd[1623]: time="2026-01-15T23:47:03.038948345Z" level=info msg="connecting to shim 4216eeaf3f3470d6475ed0f74112c06e243d48e5a168748766165a855c182c8b" address="unix:///run/containerd/s/bf4bea90909fc7198c0ae1c074ed2b5a11cca491c60166381b875326f2d8f70f" namespace=k8s.io protocol=ttrpc version=3 Jan 15 23:47:03.043427 kubelet[2882]: I0115 23:47:03.043349 2882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-blfkv" podStartSLOduration=41.043319806 podStartE2EDuration="41.043319806s" podCreationTimestamp="2026-01-15 23:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-15 23:47:03.043223326 +0000 UTC m=+47.243218671" watchObservedRunningTime="2026-01-15 23:47:03.043319806 +0000 UTC m=+47.243315151" Jan 15 23:47:03.069659 systemd[1]: Started cri-containerd-4216eeaf3f3470d6475ed0f74112c06e243d48e5a168748766165a855c182c8b.scope - libcontainer container 4216eeaf3f3470d6475ed0f74112c06e243d48e5a168748766165a855c182c8b. Jan 15 23:47:03.103778 containerd[1623]: time="2026-01-15T23:47:03.103737378Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zqjh8,Uid:fee8d1af-3972-419e-8500-84b3b6b46b71,Namespace:calico-system,Attempt:0,} returns sandbox id \"4216eeaf3f3470d6475ed0f74112c06e243d48e5a168748766165a855c182c8b\"" Jan 15 23:47:03.105430 containerd[1623]: time="2026-01-15T23:47:03.105384306Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 15 23:47:03.443846 containerd[1623]: time="2026-01-15T23:47:03.443659820Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 23:47:03.445369 containerd[1623]: time="2026-01-15T23:47:03.445312948Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 15 23:47:03.445472 containerd[1623]: time="2026-01-15T23:47:03.445389468Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Jan 15 23:47:03.445650 kubelet[2882]: E0115 23:47:03.445593 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 15 23:47:03.445650 kubelet[2882]: E0115 23:47:03.445643 2882 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 15 23:47:03.445880 kubelet[2882]: E0115 23:47:03.445844 2882 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rqb9d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-zqjh8_calico-system(fee8d1af-3972-419e-8500-84b3b6b46b71): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 15 23:47:03.448179 containerd[1623]: time="2026-01-15T23:47:03.447976881Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 15 23:47:03.528658 systemd-networkd[1522]: calic497e1ebb01: Gained IPv6LL Jan 15 23:47:03.785787 containerd[1623]: time="2026-01-15T23:47:03.785728193Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 23:47:03.787458 containerd[1623]: time="2026-01-15T23:47:03.787398881Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 15 23:47:03.787458 containerd[1623]: time="2026-01-15T23:47:03.787502361Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Jan 15 23:47:03.787910 kubelet[2882]: E0115 23:47:03.787755 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" 
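The pull failures above surface as gRPC NotFound errors from containerd, and the kubelet converts repeated ErrImagePull results into the ImagePullBackOff state seen at 23:47:03.025, retrying with a growing delay (by default the delay roughly doubles from 10s up to a 5-minute cap). A rough sketch of that capped exponential backoff pattern follows; it shows the general technique, not the kubelet's actual implementation:

    package main

    import (
        "errors"
        "fmt"
        "time"
    )

    // pullWithBackoff retries pull() with a delay that doubles after each
    // failure up to maxDelay, in the spirit of ImagePullBackOff above.
    func pullWithBackoff(pull func() error, attempts int) error {
        delay, maxDelay := 10*time.Second, 5*time.Minute
        var err error
        for i := 0; i < attempts; i++ {
            if err = pull(); err == nil {
                return nil
            }
            fmt.Printf("pull failed (%v); backing off %s\n", err, delay)
            time.Sleep(delay)
            if delay *= 2; delay > maxDelay {
                delay = maxDelay
            }
        }
        return fmt.Errorf("giving up after %d attempts: %w", attempts, err)
    }

    func main() {
        img := "ghcr.io/flatcar/calico/csi:v3.30.4"
        _ = pullWithBackoff(func() error {
            // Stands in for the "rpc error: code = NotFound" responses above.
            return errors.New(img + ": not found")
        }, 3)
    }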
Jan 15 23:47:03.788027 kubelet[2882]: E0115 23:47:03.787860 2882 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 15 23:47:03.788233 kubelet[2882]: E0115 23:47:03.788158 2882 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rqb9d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-zqjh8_calico-system(fee8d1af-3972-419e-8500-84b3b6b46b71): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 15 23:47:03.789391 kubelet[2882]: E0115 23:47:03.789343 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference 
\\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-zqjh8" podUID="fee8d1af-3972-419e-8500-84b3b6b46b71" Jan 15 23:47:04.033219 kubelet[2882]: E0115 23:47:04.033142 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-zqjh8" podUID="fee8d1af-3972-419e-8500-84b3b6b46b71" Jan 15 23:47:04.040022 systemd-networkd[1522]: vxlan.calico: Gained IPv6LL Jan 15 23:47:04.295934 systemd-networkd[1522]: calica8538e3c88: Gained IPv6LL Jan 15 23:47:05.032925 kubelet[2882]: E0115 23:47:05.032760 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-zqjh8" podUID="fee8d1af-3972-419e-8500-84b3b6b46b71" Jan 15 23:47:05.881326 containerd[1623]: time="2026-01-15T23:47:05.880855354Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76998f65d4-lbk6r,Uid:717a4ddd-81e7-49b2-b875-208b71de27d1,Namespace:calico-apiserver,Attempt:0,}" Jan 15 23:47:05.881326 containerd[1623]: time="2026-01-15T23:47:05.880862034Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-zzfg8,Uid:0a185b21-6a30-4d90-a4b5-3899fa71bcad,Namespace:kube-system,Attempt:0,}" Jan 15 23:47:05.881692 containerd[1623]: time="2026-01-15T23:47:05.881532557Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-69897db5bb-bgndc,Uid:fc5a246d-dfc3-43e7-b2c7-409da1aecc92,Namespace:calico-system,Attempt:0,}" Jan 15 23:47:05.881812 containerd[1623]: time="2026-01-15T23:47:05.881769039Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-t2blj,Uid:c476e4d7-47c4-4b45-afc2-049d681292b9,Namespace:calico-system,Attempt:0,}" Jan 15 23:47:06.075218 systemd-networkd[1522]: 
cali90822a50ead: Link UP Jan 15 23:47:06.076335 systemd-networkd[1522]: cali90822a50ead: Gained carrier Jan 15 23:47:06.093520 containerd[1623]: 2026-01-15 23:47:05.976 [INFO][4736] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--n--b7ec270451-k8s-coredns--674b8bbfcf--zzfg8-eth0 coredns-674b8bbfcf- kube-system 0a185b21-6a30-4d90-a4b5-3899fa71bcad 819 0 2026-01-15 23:46:22 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-2-n-b7ec270451 coredns-674b8bbfcf-zzfg8 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali90822a50ead [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="2601b12c1a8b96d1b631fd87f20f624e826892756b1fb5e3a1bb0000c6e2d39e" Namespace="kube-system" Pod="coredns-674b8bbfcf-zzfg8" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-coredns--674b8bbfcf--zzfg8-" Jan 15 23:47:06.093520 containerd[1623]: 2026-01-15 23:47:05.976 [INFO][4736] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2601b12c1a8b96d1b631fd87f20f624e826892756b1fb5e3a1bb0000c6e2d39e" Namespace="kube-system" Pod="coredns-674b8bbfcf-zzfg8" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-coredns--674b8bbfcf--zzfg8-eth0" Jan 15 23:47:06.093520 containerd[1623]: 2026-01-15 23:47:06.018 [INFO][4790] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2601b12c1a8b96d1b631fd87f20f624e826892756b1fb5e3a1bb0000c6e2d39e" HandleID="k8s-pod-network.2601b12c1a8b96d1b631fd87f20f624e826892756b1fb5e3a1bb0000c6e2d39e" Workload="ci--4459--2--2--n--b7ec270451-k8s-coredns--674b8bbfcf--zzfg8-eth0" Jan 15 23:47:06.093520 containerd[1623]: 2026-01-15 23:47:06.018 [INFO][4790] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="2601b12c1a8b96d1b631fd87f20f624e826892756b1fb5e3a1bb0000c6e2d39e" HandleID="k8s-pod-network.2601b12c1a8b96d1b631fd87f20f624e826892756b1fb5e3a1bb0000c6e2d39e" Workload="ci--4459--2--2--n--b7ec270451-k8s-coredns--674b8bbfcf--zzfg8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002c32e0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-2-n-b7ec270451", "pod":"coredns-674b8bbfcf-zzfg8", "timestamp":"2026-01-15 23:47:06.018181658 +0000 UTC"}, Hostname:"ci-4459-2-2-n-b7ec270451", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 15 23:47:06.093520 containerd[1623]: 2026-01-15 23:47:06.018 [INFO][4790] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 15 23:47:06.093520 containerd[1623]: 2026-01-15 23:47:06.018 [INFO][4790] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 15 23:47:06.093520 containerd[1623]: 2026-01-15 23:47:06.018 [INFO][4790] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-n-b7ec270451' Jan 15 23:47:06.093520 containerd[1623]: 2026-01-15 23:47:06.033 [INFO][4790] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.2601b12c1a8b96d1b631fd87f20f624e826892756b1fb5e3a1bb0000c6e2d39e" host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:06.093520 containerd[1623]: 2026-01-15 23:47:06.040 [INFO][4790] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:06.093520 containerd[1623]: 2026-01-15 23:47:06.046 [INFO][4790] ipam/ipam.go 511: Trying affinity for 192.168.85.64/26 host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:06.093520 containerd[1623]: 2026-01-15 23:47:06.049 [INFO][4790] ipam/ipam.go 158: Attempting to load block cidr=192.168.85.64/26 host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:06.093520 containerd[1623]: 2026-01-15 23:47:06.053 [INFO][4790] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.85.64/26 host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:06.093520 containerd[1623]: 2026-01-15 23:47:06.053 [INFO][4790] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.85.64/26 handle="k8s-pod-network.2601b12c1a8b96d1b631fd87f20f624e826892756b1fb5e3a1bb0000c6e2d39e" host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:06.093520 containerd[1623]: 2026-01-15 23:47:06.055 [INFO][4790] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.2601b12c1a8b96d1b631fd87f20f624e826892756b1fb5e3a1bb0000c6e2d39e Jan 15 23:47:06.093520 containerd[1623]: 2026-01-15 23:47:06.060 [INFO][4790] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.85.64/26 handle="k8s-pod-network.2601b12c1a8b96d1b631fd87f20f624e826892756b1fb5e3a1bb0000c6e2d39e" host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:06.093520 containerd[1623]: 2026-01-15 23:47:06.067 [INFO][4790] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.85.69/26] block=192.168.85.64/26 handle="k8s-pod-network.2601b12c1a8b96d1b631fd87f20f624e826892756b1fb5e3a1bb0000c6e2d39e" host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:06.093520 containerd[1623]: 2026-01-15 23:47:06.067 [INFO][4790] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.85.69/26] handle="k8s-pod-network.2601b12c1a8b96d1b631fd87f20f624e826892756b1fb5e3a1bb0000c6e2d39e" host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:06.093520 containerd[1623]: 2026-01-15 23:47:06.067 [INFO][4790] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
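The "Gained IPv6LL" messages earlier (calic497e1ebb01, vxlan.calico, calica8538e3c88) mean the kernel finished duplicate-address detection for an IPv6 link-local address on each new interface. When the kernel derives that address from the MAC it uses modified EUI-64: flip the universal/local bit of the first octet and splice ff:fe into the middle. A small sketch of the derivation, using the container-side MAC c6:2d:21:95:c4:1d recorded above (whether this particular interface used EUI-64 or stable-privacy addressing is not visible in the log):

    package main

    import (
        "fmt"
        "net"
    )

    // linkLocalFromMAC builds the fe80::/64 address that modified EUI-64
    // derives from a 48-bit MAC: flip bit 1 of the first octet and splice
    // ff:fe into the middle.
    func linkLocalFromMAC(mac net.HardwareAddr) net.IP {
        ip := make(net.IP, net.IPv6len)
        ip[0], ip[1] = 0xfe, 0x80
        ip[8] = mac[0] ^ 0x02 // flip the universal/local bit
        ip[9], ip[10], ip[11] = mac[1], mac[2], 0xff
        ip[12], ip[13], ip[14], ip[15] = 0xfe, mac[3], mac[4], mac[5]
        return ip
    }

    func main() {
        mac, _ := net.ParseMAC("c6:2d:21:95:c4:1d") // calica8538e3c88's MAC from the log
        fmt.Println(linkLocalFromMAC(mac))          // fe80::c42d:21ff:fe95:c41d
    }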
Jan 15 23:47:06.093520 containerd[1623]: 2026-01-15 23:47:06.067 [INFO][4790] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.85.69/26] IPv6=[] ContainerID="2601b12c1a8b96d1b631fd87f20f624e826892756b1fb5e3a1bb0000c6e2d39e" HandleID="k8s-pod-network.2601b12c1a8b96d1b631fd87f20f624e826892756b1fb5e3a1bb0000c6e2d39e" Workload="ci--4459--2--2--n--b7ec270451-k8s-coredns--674b8bbfcf--zzfg8-eth0" Jan 15 23:47:06.094023 containerd[1623]: 2026-01-15 23:47:06.072 [INFO][4736] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2601b12c1a8b96d1b631fd87f20f624e826892756b1fb5e3a1bb0000c6e2d39e" Namespace="kube-system" Pod="coredns-674b8bbfcf-zzfg8" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-coredns--674b8bbfcf--zzfg8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--b7ec270451-k8s-coredns--674b8bbfcf--zzfg8-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"0a185b21-6a30-4d90-a4b5-3899fa71bcad", ResourceVersion:"819", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 23, 46, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-b7ec270451", ContainerID:"", Pod:"coredns-674b8bbfcf-zzfg8", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.85.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali90822a50ead", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 23:47:06.094023 containerd[1623]: 2026-01-15 23:47:06.072 [INFO][4736] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.85.69/32] ContainerID="2601b12c1a8b96d1b631fd87f20f624e826892756b1fb5e3a1bb0000c6e2d39e" Namespace="kube-system" Pod="coredns-674b8bbfcf-zzfg8" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-coredns--674b8bbfcf--zzfg8-eth0" Jan 15 23:47:06.094023 containerd[1623]: 2026-01-15 23:47:06.072 [INFO][4736] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali90822a50ead ContainerID="2601b12c1a8b96d1b631fd87f20f624e826892756b1fb5e3a1bb0000c6e2d39e" Namespace="kube-system" Pod="coredns-674b8bbfcf-zzfg8" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-coredns--674b8bbfcf--zzfg8-eth0" Jan 15 23:47:06.094023 containerd[1623]: 2026-01-15 23:47:06.075 [INFO][4736] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2601b12c1a8b96d1b631fd87f20f624e826892756b1fb5e3a1bb0000c6e2d39e" Namespace="kube-system" 
Pod="coredns-674b8bbfcf-zzfg8" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-coredns--674b8bbfcf--zzfg8-eth0" Jan 15 23:47:06.094023 containerd[1623]: 2026-01-15 23:47:06.076 [INFO][4736] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="2601b12c1a8b96d1b631fd87f20f624e826892756b1fb5e3a1bb0000c6e2d39e" Namespace="kube-system" Pod="coredns-674b8bbfcf-zzfg8" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-coredns--674b8bbfcf--zzfg8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--b7ec270451-k8s-coredns--674b8bbfcf--zzfg8-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"0a185b21-6a30-4d90-a4b5-3899fa71bcad", ResourceVersion:"819", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 23, 46, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-b7ec270451", ContainerID:"2601b12c1a8b96d1b631fd87f20f624e826892756b1fb5e3a1bb0000c6e2d39e", Pod:"coredns-674b8bbfcf-zzfg8", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.85.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali90822a50ead", MAC:"da:0e:4b:c1:9a:73", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 23:47:06.094023 containerd[1623]: 2026-01-15 23:47:06.091 [INFO][4736] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="2601b12c1a8b96d1b631fd87f20f624e826892756b1fb5e3a1bb0000c6e2d39e" Namespace="kube-system" Pod="coredns-674b8bbfcf-zzfg8" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-coredns--674b8bbfcf--zzfg8-eth0" Jan 15 23:47:06.118972 containerd[1623]: time="2026-01-15T23:47:06.118910504Z" level=info msg="connecting to shim 2601b12c1a8b96d1b631fd87f20f624e826892756b1fb5e3a1bb0000c6e2d39e" address="unix:///run/containerd/s/4916612ef30a44a5af211380a1f45e771de165c4181e54e435b8add452e031d2" namespace=k8s.io protocol=ttrpc version=3 Jan 15 23:47:06.142590 systemd[1]: Started cri-containerd-2601b12c1a8b96d1b631fd87f20f624e826892756b1fb5e3a1bb0000c6e2d39e.scope - libcontainer container 2601b12c1a8b96d1b631fd87f20f624e826892756b1fb5e3a1bb0000c6e2d39e. 
Jan 15 23:47:06.174413 systemd-networkd[1522]: calib448b98596f: Link UP Jan 15 23:47:06.175464 systemd-networkd[1522]: calib448b98596f: Gained carrier Jan 15 23:47:06.183346 containerd[1623]: time="2026-01-15T23:47:06.183285095Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-zzfg8,Uid:0a185b21-6a30-4d90-a4b5-3899fa71bcad,Namespace:kube-system,Attempt:0,} returns sandbox id \"2601b12c1a8b96d1b631fd87f20f624e826892756b1fb5e3a1bb0000c6e2d39e\"" Jan 15 23:47:06.191712 containerd[1623]: time="2026-01-15T23:47:06.191045453Z" level=info msg="CreateContainer within sandbox \"2601b12c1a8b96d1b631fd87f20f624e826892756b1fb5e3a1bb0000c6e2d39e\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 15 23:47:06.196536 containerd[1623]: 2026-01-15 23:47:05.986 [INFO][4749] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--n--b7ec270451-k8s-calico--kube--controllers--69897db5bb--bgndc-eth0 calico-kube-controllers-69897db5bb- calico-system fc5a246d-dfc3-43e7-b2c7-409da1aecc92 824 0 2026-01-15 23:46:37 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:69897db5bb projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4459-2-2-n-b7ec270451 calico-kube-controllers-69897db5bb-bgndc eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calib448b98596f [] [] }} ContainerID="8ac53e3f8d2ff6aa0253ea49e7b5019ccff0ac29d825e4856239f556bfcf19b7" Namespace="calico-system" Pod="calico-kube-controllers-69897db5bb-bgndc" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-calico--kube--controllers--69897db5bb--bgndc-" Jan 15 23:47:06.196536 containerd[1623]: 2026-01-15 23:47:05.986 [INFO][4749] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8ac53e3f8d2ff6aa0253ea49e7b5019ccff0ac29d825e4856239f556bfcf19b7" Namespace="calico-system" Pod="calico-kube-controllers-69897db5bb-bgndc" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-calico--kube--controllers--69897db5bb--bgndc-eth0" Jan 15 23:47:06.196536 containerd[1623]: 2026-01-15 23:47:06.039 [INFO][4798] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8ac53e3f8d2ff6aa0253ea49e7b5019ccff0ac29d825e4856239f556bfcf19b7" HandleID="k8s-pod-network.8ac53e3f8d2ff6aa0253ea49e7b5019ccff0ac29d825e4856239f556bfcf19b7" Workload="ci--4459--2--2--n--b7ec270451-k8s-calico--kube--controllers--69897db5bb--bgndc-eth0" Jan 15 23:47:06.196536 containerd[1623]: 2026-01-15 23:47:06.039 [INFO][4798] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="8ac53e3f8d2ff6aa0253ea49e7b5019ccff0ac29d825e4856239f556bfcf19b7" HandleID="k8s-pod-network.8ac53e3f8d2ff6aa0253ea49e7b5019ccff0ac29d825e4856239f556bfcf19b7" Workload="ci--4459--2--2--n--b7ec270451-k8s-calico--kube--controllers--69897db5bb--bgndc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000354fd0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-n-b7ec270451", "pod":"calico-kube-controllers-69897db5bb-bgndc", "timestamp":"2026-01-15 23:47:06.03939744 +0000 UTC"}, Hostname:"ci-4459-2-2-n-b7ec270451", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 15 23:47:06.196536 
containerd[1623]: 2026-01-15 23:47:06.041 [INFO][4798] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 15 23:47:06.196536 containerd[1623]: 2026-01-15 23:47:06.067 [INFO][4798] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 15 23:47:06.196536 containerd[1623]: 2026-01-15 23:47:06.067 [INFO][4798] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-n-b7ec270451' Jan 15 23:47:06.196536 containerd[1623]: 2026-01-15 23:47:06.134 [INFO][4798] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8ac53e3f8d2ff6aa0253ea49e7b5019ccff0ac29d825e4856239f556bfcf19b7" host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:06.196536 containerd[1623]: 2026-01-15 23:47:06.141 [INFO][4798] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:06.196536 containerd[1623]: 2026-01-15 23:47:06.149 [INFO][4798] ipam/ipam.go 511: Trying affinity for 192.168.85.64/26 host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:06.196536 containerd[1623]: 2026-01-15 23:47:06.151 [INFO][4798] ipam/ipam.go 158: Attempting to load block cidr=192.168.85.64/26 host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:06.196536 containerd[1623]: 2026-01-15 23:47:06.153 [INFO][4798] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.85.64/26 host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:06.196536 containerd[1623]: 2026-01-15 23:47:06.153 [INFO][4798] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.85.64/26 handle="k8s-pod-network.8ac53e3f8d2ff6aa0253ea49e7b5019ccff0ac29d825e4856239f556bfcf19b7" host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:06.196536 containerd[1623]: 2026-01-15 23:47:06.156 [INFO][4798] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.8ac53e3f8d2ff6aa0253ea49e7b5019ccff0ac29d825e4856239f556bfcf19b7 Jan 15 23:47:06.196536 containerd[1623]: 2026-01-15 23:47:06.162 [INFO][4798] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.85.64/26 handle="k8s-pod-network.8ac53e3f8d2ff6aa0253ea49e7b5019ccff0ac29d825e4856239f556bfcf19b7" host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:06.196536 containerd[1623]: 2026-01-15 23:47:06.168 [INFO][4798] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.85.70/26] block=192.168.85.64/26 handle="k8s-pod-network.8ac53e3f8d2ff6aa0253ea49e7b5019ccff0ac29d825e4856239f556bfcf19b7" host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:06.196536 containerd[1623]: 2026-01-15 23:47:06.169 [INFO][4798] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.85.70/26] handle="k8s-pod-network.8ac53e3f8d2ff6aa0253ea49e7b5019ccff0ac29d825e4856239f556bfcf19b7" host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:06.196536 containerd[1623]: 2026-01-15 23:47:06.169 [INFO][4798] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 15 23:47:06.196536 containerd[1623]: 2026-01-15 23:47:06.169 [INFO][4798] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.85.70/26] IPv6=[] ContainerID="8ac53e3f8d2ff6aa0253ea49e7b5019ccff0ac29d825e4856239f556bfcf19b7" HandleID="k8s-pod-network.8ac53e3f8d2ff6aa0253ea49e7b5019ccff0ac29d825e4856239f556bfcf19b7" Workload="ci--4459--2--2--n--b7ec270451-k8s-calico--kube--controllers--69897db5bb--bgndc-eth0" Jan 15 23:47:06.197199 containerd[1623]: 2026-01-15 23:47:06.171 [INFO][4749] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8ac53e3f8d2ff6aa0253ea49e7b5019ccff0ac29d825e4856239f556bfcf19b7" Namespace="calico-system" Pod="calico-kube-controllers-69897db5bb-bgndc" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-calico--kube--controllers--69897db5bb--bgndc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--b7ec270451-k8s-calico--kube--controllers--69897db5bb--bgndc-eth0", GenerateName:"calico-kube-controllers-69897db5bb-", Namespace:"calico-system", SelfLink:"", UID:"fc5a246d-dfc3-43e7-b2c7-409da1aecc92", ResourceVersion:"824", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 23, 46, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"69897db5bb", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-b7ec270451", ContainerID:"", Pod:"calico-kube-controllers-69897db5bb-bgndc", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.85.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calib448b98596f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 23:47:06.197199 containerd[1623]: 2026-01-15 23:47:06.171 [INFO][4749] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.85.70/32] ContainerID="8ac53e3f8d2ff6aa0253ea49e7b5019ccff0ac29d825e4856239f556bfcf19b7" Namespace="calico-system" Pod="calico-kube-controllers-69897db5bb-bgndc" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-calico--kube--controllers--69897db5bb--bgndc-eth0" Jan 15 23:47:06.197199 containerd[1623]: 2026-01-15 23:47:06.172 [INFO][4749] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib448b98596f ContainerID="8ac53e3f8d2ff6aa0253ea49e7b5019ccff0ac29d825e4856239f556bfcf19b7" Namespace="calico-system" Pod="calico-kube-controllers-69897db5bb-bgndc" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-calico--kube--controllers--69897db5bb--bgndc-eth0" Jan 15 23:47:06.197199 containerd[1623]: 2026-01-15 23:47:06.175 [INFO][4749] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8ac53e3f8d2ff6aa0253ea49e7b5019ccff0ac29d825e4856239f556bfcf19b7" Namespace="calico-system" Pod="calico-kube-controllers-69897db5bb-bgndc" 
WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-calico--kube--controllers--69897db5bb--bgndc-eth0" Jan 15 23:47:06.197199 containerd[1623]: 2026-01-15 23:47:06.179 [INFO][4749] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8ac53e3f8d2ff6aa0253ea49e7b5019ccff0ac29d825e4856239f556bfcf19b7" Namespace="calico-system" Pod="calico-kube-controllers-69897db5bb-bgndc" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-calico--kube--controllers--69897db5bb--bgndc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--b7ec270451-k8s-calico--kube--controllers--69897db5bb--bgndc-eth0", GenerateName:"calico-kube-controllers-69897db5bb-", Namespace:"calico-system", SelfLink:"", UID:"fc5a246d-dfc3-43e7-b2c7-409da1aecc92", ResourceVersion:"824", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 23, 46, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"69897db5bb", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-b7ec270451", ContainerID:"8ac53e3f8d2ff6aa0253ea49e7b5019ccff0ac29d825e4856239f556bfcf19b7", Pod:"calico-kube-controllers-69897db5bb-bgndc", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.85.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calib448b98596f", MAC:"52:47:10:53:62:78", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 23:47:06.197199 containerd[1623]: 2026-01-15 23:47:06.192 [INFO][4749] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8ac53e3f8d2ff6aa0253ea49e7b5019ccff0ac29d825e4856239f556bfcf19b7" Namespace="calico-system" Pod="calico-kube-controllers-69897db5bb-bgndc" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-calico--kube--controllers--69897db5bb--bgndc-eth0" Jan 15 23:47:06.203263 containerd[1623]: time="2026-01-15T23:47:06.203205431Z" level=info msg="Container 6abab843d1afe455427cc632e331eb8968f6f496422c5cd99191099b429e2087: CDI devices from CRI Config.CDIDevices: []" Jan 15 23:47:06.210958 containerd[1623]: time="2026-01-15T23:47:06.210905829Z" level=info msg="CreateContainer within sandbox \"2601b12c1a8b96d1b631fd87f20f624e826892756b1fb5e3a1bb0000c6e2d39e\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"6abab843d1afe455427cc632e331eb8968f6f496422c5cd99191099b429e2087\"" Jan 15 23:47:06.211382 containerd[1623]: time="2026-01-15T23:47:06.211362231Z" level=info msg="StartContainer for \"6abab843d1afe455427cc632e331eb8968f6f496422c5cd99191099b429e2087\"" Jan 15 23:47:06.212533 containerd[1623]: time="2026-01-15T23:47:06.212479996Z" level=info msg="connecting to shim 6abab843d1afe455427cc632e331eb8968f6f496422c5cd99191099b429e2087" 
address="unix:///run/containerd/s/4916612ef30a44a5af211380a1f45e771de165c4181e54e435b8add452e031d2" protocol=ttrpc version=3 Jan 15 23:47:06.227363 containerd[1623]: time="2026-01-15T23:47:06.227281268Z" level=info msg="connecting to shim 8ac53e3f8d2ff6aa0253ea49e7b5019ccff0ac29d825e4856239f556bfcf19b7" address="unix:///run/containerd/s/b2309cb6d0a59149cd0b2f16607a1dddffe00fa530537a8a475ec94513fcf25b" namespace=k8s.io protocol=ttrpc version=3 Jan 15 23:47:06.240703 systemd[1]: Started cri-containerd-6abab843d1afe455427cc632e331eb8968f6f496422c5cd99191099b429e2087.scope - libcontainer container 6abab843d1afe455427cc632e331eb8968f6f496422c5cd99191099b429e2087. Jan 15 23:47:06.249697 systemd[1]: Started cri-containerd-8ac53e3f8d2ff6aa0253ea49e7b5019ccff0ac29d825e4856239f556bfcf19b7.scope - libcontainer container 8ac53e3f8d2ff6aa0253ea49e7b5019ccff0ac29d825e4856239f556bfcf19b7. Jan 15 23:47:06.283713 containerd[1623]: time="2026-01-15T23:47:06.283591380Z" level=info msg="StartContainer for \"6abab843d1afe455427cc632e331eb8968f6f496422c5cd99191099b429e2087\" returns successfully" Jan 15 23:47:06.288769 systemd-networkd[1522]: cali4c1c7b99222: Link UP Jan 15 23:47:06.288972 systemd-networkd[1522]: cali4c1c7b99222: Gained carrier Jan 15 23:47:06.311774 containerd[1623]: 2026-01-15 23:47:06.001 [INFO][4764] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--n--b7ec270451-k8s-goldmane--666569f655--t2blj-eth0 goldmane-666569f655- calico-system c476e4d7-47c4-4b45-afc2-049d681292b9 825 0 2026-01-15 23:46:36 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:666569f655 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4459-2-2-n-b7ec270451 goldmane-666569f655-t2blj eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali4c1c7b99222 [] [] }} ContainerID="20797ba677e5061910e33a3f625ec1f656023668a50d5dfe5b0bb98d207a2567" Namespace="calico-system" Pod="goldmane-666569f655-t2blj" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-goldmane--666569f655--t2blj-" Jan 15 23:47:06.311774 containerd[1623]: 2026-01-15 23:47:06.001 [INFO][4764] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="20797ba677e5061910e33a3f625ec1f656023668a50d5dfe5b0bb98d207a2567" Namespace="calico-system" Pod="goldmane-666569f655-t2blj" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-goldmane--666569f655--t2blj-eth0" Jan 15 23:47:06.311774 containerd[1623]: 2026-01-15 23:47:06.045 [INFO][4807] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="20797ba677e5061910e33a3f625ec1f656023668a50d5dfe5b0bb98d207a2567" HandleID="k8s-pod-network.20797ba677e5061910e33a3f625ec1f656023668a50d5dfe5b0bb98d207a2567" Workload="ci--4459--2--2--n--b7ec270451-k8s-goldmane--666569f655--t2blj-eth0" Jan 15 23:47:06.311774 containerd[1623]: 2026-01-15 23:47:06.045 [INFO][4807] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="20797ba677e5061910e33a3f625ec1f656023668a50d5dfe5b0bb98d207a2567" HandleID="k8s-pod-network.20797ba677e5061910e33a3f625ec1f656023668a50d5dfe5b0bb98d207a2567" Workload="ci--4459--2--2--n--b7ec270451-k8s-goldmane--666569f655--t2blj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002dd590), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-n-b7ec270451", "pod":"goldmane-666569f655-t2blj", "timestamp":"2026-01-15 
23:47:06.04555639 +0000 UTC"}, Hostname:"ci-4459-2-2-n-b7ec270451", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 15 23:47:06.311774 containerd[1623]: 2026-01-15 23:47:06.045 [INFO][4807] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 15 23:47:06.311774 containerd[1623]: 2026-01-15 23:47:06.169 [INFO][4807] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 15 23:47:06.311774 containerd[1623]: 2026-01-15 23:47:06.169 [INFO][4807] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-n-b7ec270451' Jan 15 23:47:06.311774 containerd[1623]: 2026-01-15 23:47:06.235 [INFO][4807] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.20797ba677e5061910e33a3f625ec1f656023668a50d5dfe5b0bb98d207a2567" host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:06.311774 containerd[1623]: 2026-01-15 23:47:06.242 [INFO][4807] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:06.311774 containerd[1623]: 2026-01-15 23:47:06.251 [INFO][4807] ipam/ipam.go 511: Trying affinity for 192.168.85.64/26 host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:06.311774 containerd[1623]: 2026-01-15 23:47:06.255 [INFO][4807] ipam/ipam.go 158: Attempting to load block cidr=192.168.85.64/26 host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:06.311774 containerd[1623]: 2026-01-15 23:47:06.258 [INFO][4807] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.85.64/26 host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:06.311774 containerd[1623]: 2026-01-15 23:47:06.258 [INFO][4807] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.85.64/26 handle="k8s-pod-network.20797ba677e5061910e33a3f625ec1f656023668a50d5dfe5b0bb98d207a2567" host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:06.311774 containerd[1623]: 2026-01-15 23:47:06.260 [INFO][4807] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.20797ba677e5061910e33a3f625ec1f656023668a50d5dfe5b0bb98d207a2567 Jan 15 23:47:06.311774 containerd[1623]: 2026-01-15 23:47:06.268 [INFO][4807] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.85.64/26 handle="k8s-pod-network.20797ba677e5061910e33a3f625ec1f656023668a50d5dfe5b0bb98d207a2567" host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:06.311774 containerd[1623]: 2026-01-15 23:47:06.279 [INFO][4807] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.85.71/26] block=192.168.85.64/26 handle="k8s-pod-network.20797ba677e5061910e33a3f625ec1f656023668a50d5dfe5b0bb98d207a2567" host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:06.311774 containerd[1623]: 2026-01-15 23:47:06.279 [INFO][4807] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.85.71/26] handle="k8s-pod-network.20797ba677e5061910e33a3f625ec1f656023668a50d5dfe5b0bb98d207a2567" host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:06.311774 containerd[1623]: 2026-01-15 23:47:06.279 [INFO][4807] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 15 23:47:06.311774 containerd[1623]: 2026-01-15 23:47:06.279 [INFO][4807] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.85.71/26] IPv6=[] ContainerID="20797ba677e5061910e33a3f625ec1f656023668a50d5dfe5b0bb98d207a2567" HandleID="k8s-pod-network.20797ba677e5061910e33a3f625ec1f656023668a50d5dfe5b0bb98d207a2567" Workload="ci--4459--2--2--n--b7ec270451-k8s-goldmane--666569f655--t2blj-eth0" Jan 15 23:47:06.313107 containerd[1623]: 2026-01-15 23:47:06.283 [INFO][4764] cni-plugin/k8s.go 418: Populated endpoint ContainerID="20797ba677e5061910e33a3f625ec1f656023668a50d5dfe5b0bb98d207a2567" Namespace="calico-system" Pod="goldmane-666569f655-t2blj" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-goldmane--666569f655--t2blj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--b7ec270451-k8s-goldmane--666569f655--t2blj-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"c476e4d7-47c4-4b45-afc2-049d681292b9", ResourceVersion:"825", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 23, 46, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-b7ec270451", ContainerID:"", Pod:"goldmane-666569f655-t2blj", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.85.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali4c1c7b99222", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 23:47:06.313107 containerd[1623]: 2026-01-15 23:47:06.284 [INFO][4764] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.85.71/32] ContainerID="20797ba677e5061910e33a3f625ec1f656023668a50d5dfe5b0bb98d207a2567" Namespace="calico-system" Pod="goldmane-666569f655-t2blj" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-goldmane--666569f655--t2blj-eth0" Jan 15 23:47:06.313107 containerd[1623]: 2026-01-15 23:47:06.284 [INFO][4764] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4c1c7b99222 ContainerID="20797ba677e5061910e33a3f625ec1f656023668a50d5dfe5b0bb98d207a2567" Namespace="calico-system" Pod="goldmane-666569f655-t2blj" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-goldmane--666569f655--t2blj-eth0" Jan 15 23:47:06.313107 containerd[1623]: 2026-01-15 23:47:06.288 [INFO][4764] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="20797ba677e5061910e33a3f625ec1f656023668a50d5dfe5b0bb98d207a2567" Namespace="calico-system" Pod="goldmane-666569f655-t2blj" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-goldmane--666569f655--t2blj-eth0" Jan 15 23:47:06.313107 containerd[1623]: 2026-01-15 23:47:06.288 [INFO][4764] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="20797ba677e5061910e33a3f625ec1f656023668a50d5dfe5b0bb98d207a2567" 
Namespace="calico-system" Pod="goldmane-666569f655-t2blj" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-goldmane--666569f655--t2blj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--b7ec270451-k8s-goldmane--666569f655--t2blj-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"c476e4d7-47c4-4b45-afc2-049d681292b9", ResourceVersion:"825", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 23, 46, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-b7ec270451", ContainerID:"20797ba677e5061910e33a3f625ec1f656023668a50d5dfe5b0bb98d207a2567", Pod:"goldmane-666569f655-t2blj", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.85.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali4c1c7b99222", MAC:"c2:67:f3:b4:9f:e2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 23:47:06.313107 containerd[1623]: 2026-01-15 23:47:06.309 [INFO][4764] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="20797ba677e5061910e33a3f625ec1f656023668a50d5dfe5b0bb98d207a2567" Namespace="calico-system" Pod="goldmane-666569f655-t2blj" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-goldmane--666569f655--t2blj-eth0" Jan 15 23:47:06.318394 containerd[1623]: time="2026-01-15T23:47:06.318342588Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-69897db5bb-bgndc,Uid:fc5a246d-dfc3-43e7-b2c7-409da1aecc92,Namespace:calico-system,Attempt:0,} returns sandbox id \"8ac53e3f8d2ff6aa0253ea49e7b5019ccff0ac29d825e4856239f556bfcf19b7\"" Jan 15 23:47:06.321197 containerd[1623]: time="2026-01-15T23:47:06.320983240Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 15 23:47:06.321305 kubelet[2882]: I0115 23:47:06.321003 2882 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 15 23:47:06.364090 containerd[1623]: time="2026-01-15T23:47:06.364041808Z" level=info msg="connecting to shim 20797ba677e5061910e33a3f625ec1f656023668a50d5dfe5b0bb98d207a2567" address="unix:///run/containerd/s/fb6dd4e8a7bf31f72ada8201cfa4c97d8acf8076588a79ec3b79d1ba39ba4c92" namespace=k8s.io protocol=ttrpc version=3 Jan 15 23:47:06.383913 systemd-networkd[1522]: cali2a5a9fa016e: Link UP Jan 15 23:47:06.384646 systemd-networkd[1522]: cali2a5a9fa016e: Gained carrier Jan 15 23:47:06.392682 systemd[1]: Started cri-containerd-20797ba677e5061910e33a3f625ec1f656023668a50d5dfe5b0bb98d207a2567.scope - libcontainer container 20797ba677e5061910e33a3f625ec1f656023668a50d5dfe5b0bb98d207a2567. 
Jan 15 23:47:06.413334 containerd[1623]: 2026-01-15 23:47:06.001 [INFO][4730] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--n--b7ec270451-k8s-calico--apiserver--76998f65d4--lbk6r-eth0 calico-apiserver-76998f65d4- calico-apiserver 717a4ddd-81e7-49b2-b875-208b71de27d1 821 0 2026-01-15 23:46:32 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:76998f65d4 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-2-n-b7ec270451 calico-apiserver-76998f65d4-lbk6r eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali2a5a9fa016e [] [] }} ContainerID="6eb7b033e5976252ac2ff87fdf28920d3fc9ed52ee19dc5b0b55a7fb26b31c0d" Namespace="calico-apiserver" Pod="calico-apiserver-76998f65d4-lbk6r" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-calico--apiserver--76998f65d4--lbk6r-" Jan 15 23:47:06.413334 containerd[1623]: 2026-01-15 23:47:06.001 [INFO][4730] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6eb7b033e5976252ac2ff87fdf28920d3fc9ed52ee19dc5b0b55a7fb26b31c0d" Namespace="calico-apiserver" Pod="calico-apiserver-76998f65d4-lbk6r" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-calico--apiserver--76998f65d4--lbk6r-eth0" Jan 15 23:47:06.413334 containerd[1623]: 2026-01-15 23:47:06.053 [INFO][4806] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6eb7b033e5976252ac2ff87fdf28920d3fc9ed52ee19dc5b0b55a7fb26b31c0d" HandleID="k8s-pod-network.6eb7b033e5976252ac2ff87fdf28920d3fc9ed52ee19dc5b0b55a7fb26b31c0d" Workload="ci--4459--2--2--n--b7ec270451-k8s-calico--apiserver--76998f65d4--lbk6r-eth0" Jan 15 23:47:06.413334 containerd[1623]: 2026-01-15 23:47:06.053 [INFO][4806] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="6eb7b033e5976252ac2ff87fdf28920d3fc9ed52ee19dc5b0b55a7fb26b31c0d" HandleID="k8s-pod-network.6eb7b033e5976252ac2ff87fdf28920d3fc9ed52ee19dc5b0b55a7fb26b31c0d" Workload="ci--4459--2--2--n--b7ec270451-k8s-calico--apiserver--76998f65d4--lbk6r-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002c3260), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459-2-2-n-b7ec270451", "pod":"calico-apiserver-76998f65d4-lbk6r", "timestamp":"2026-01-15 23:47:06.053223827 +0000 UTC"}, Hostname:"ci-4459-2-2-n-b7ec270451", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 15 23:47:06.413334 containerd[1623]: 2026-01-15 23:47:06.053 [INFO][4806] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 15 23:47:06.413334 containerd[1623]: 2026-01-15 23:47:06.279 [INFO][4806] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
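Annotation: a naming pattern is visible in these entries — the WorkloadEndpoint names (for example ci--4459--2--2--n--b7ec270451-k8s-calico--apiserver--76998f65d4--lbk6r-eth0) appear to be built from the node name, pod name, and interface name with every literal '-' doubled, so that a single '-' can serve unambiguously as the field separator in <node>-k8s-<pod>-<endpoint>. The sketch below is an inference from the names in this log, not a quote of Calico's implementation:

```go
// Reconstruct a WorkloadEndpoint name from the node/pod/interface names seen
// in the log, doubling literal dashes so single '-' can act as a separator.
package main

import (
	"fmt"
	"strings"
)

func escape(s string) string { return strings.ReplaceAll(s, "-", "--") }

func main() {
	node, pod, iface := "ci-4459-2-2-n-b7ec270451", "goldmane-666569f655-t2blj", "eth0"
	name := fmt.Sprintf("%s-k8s-%s-%s", escape(node), escape(pod), iface)
	fmt.Println(name)
	// Output: ci--4459--2--2--n--b7ec270451-k8s-goldmane--666569f655--t2blj-eth0
}
```

Running it reproduces the goldmane endpoint name exactly as it appears in the entries above.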
Jan 15 23:47:06.413334 containerd[1623]: 2026-01-15 23:47:06.279 [INFO][4806] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-n-b7ec270451' Jan 15 23:47:06.413334 containerd[1623]: 2026-01-15 23:47:06.337 [INFO][4806] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6eb7b033e5976252ac2ff87fdf28920d3fc9ed52ee19dc5b0b55a7fb26b31c0d" host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:06.413334 containerd[1623]: 2026-01-15 23:47:06.345 [INFO][4806] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:06.413334 containerd[1623]: 2026-01-15 23:47:06.352 [INFO][4806] ipam/ipam.go 511: Trying affinity for 192.168.85.64/26 host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:06.413334 containerd[1623]: 2026-01-15 23:47:06.355 [INFO][4806] ipam/ipam.go 158: Attempting to load block cidr=192.168.85.64/26 host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:06.413334 containerd[1623]: 2026-01-15 23:47:06.358 [INFO][4806] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.85.64/26 host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:06.413334 containerd[1623]: 2026-01-15 23:47:06.358 [INFO][4806] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.85.64/26 handle="k8s-pod-network.6eb7b033e5976252ac2ff87fdf28920d3fc9ed52ee19dc5b0b55a7fb26b31c0d" host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:06.413334 containerd[1623]: 2026-01-15 23:47:06.361 [INFO][4806] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.6eb7b033e5976252ac2ff87fdf28920d3fc9ed52ee19dc5b0b55a7fb26b31c0d Jan 15 23:47:06.413334 containerd[1623]: 2026-01-15 23:47:06.367 [INFO][4806] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.85.64/26 handle="k8s-pod-network.6eb7b033e5976252ac2ff87fdf28920d3fc9ed52ee19dc5b0b55a7fb26b31c0d" host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:06.413334 containerd[1623]: 2026-01-15 23:47:06.375 [INFO][4806] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.85.72/26] block=192.168.85.64/26 handle="k8s-pod-network.6eb7b033e5976252ac2ff87fdf28920d3fc9ed52ee19dc5b0b55a7fb26b31c0d" host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:06.413334 containerd[1623]: 2026-01-15 23:47:06.375 [INFO][4806] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.85.72/26] handle="k8s-pod-network.6eb7b033e5976252ac2ff87fdf28920d3fc9ed52ee19dc5b0b55a7fb26b31c0d" host="ci-4459-2-2-n-b7ec270451" Jan 15 23:47:06.413334 containerd[1623]: 2026-01-15 23:47:06.375 [INFO][4806] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
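Annotation: the IPAM entries above walk through Calico's block-affinity flow — take the host-wide lock, confirm this node's affinity for the 192.168.85.64/26 block, claim the next free address (192.168.85.72, right after the .71 assigned to the goldmane pod), write the block back to claim the IP, and release the lock. A minimal stdlib-only check of the arithmetic, with the prefix and addresses copied from the log:

```go
// Verify the block math from the IPAM entries: a /26 holds 64 addresses, and
// both assigned pod IPs fall inside the node's affine block. Standard library
// only; nothing here touches Calico itself.
package main

import (
	"fmt"
	"net/netip"
)

func main() {
	block := netip.MustParsePrefix("192.168.85.64/26")
	fmt.Println("addresses per block:", 1<<(32-block.Bits())) // 64

	for _, s := range []string{"192.168.85.71", "192.168.85.72"} {
		ip := netip.MustParseAddr(s)
		fmt.Printf("%s in %s: %v\n", ip, block, block.Contains(ip))
	}
}
```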
Jan 15 23:47:06.413334 containerd[1623]: 2026-01-15 23:47:06.375 [INFO][4806] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.85.72/26] IPv6=[] ContainerID="6eb7b033e5976252ac2ff87fdf28920d3fc9ed52ee19dc5b0b55a7fb26b31c0d" HandleID="k8s-pod-network.6eb7b033e5976252ac2ff87fdf28920d3fc9ed52ee19dc5b0b55a7fb26b31c0d" Workload="ci--4459--2--2--n--b7ec270451-k8s-calico--apiserver--76998f65d4--lbk6r-eth0" Jan 15 23:47:06.413839 containerd[1623]: 2026-01-15 23:47:06.380 [INFO][4730] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6eb7b033e5976252ac2ff87fdf28920d3fc9ed52ee19dc5b0b55a7fb26b31c0d" Namespace="calico-apiserver" Pod="calico-apiserver-76998f65d4-lbk6r" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-calico--apiserver--76998f65d4--lbk6r-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--b7ec270451-k8s-calico--apiserver--76998f65d4--lbk6r-eth0", GenerateName:"calico-apiserver-76998f65d4-", Namespace:"calico-apiserver", SelfLink:"", UID:"717a4ddd-81e7-49b2-b875-208b71de27d1", ResourceVersion:"821", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 23, 46, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"76998f65d4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-b7ec270451", ContainerID:"", Pod:"calico-apiserver-76998f65d4-lbk6r", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.85.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali2a5a9fa016e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 23:47:06.413839 containerd[1623]: 2026-01-15 23:47:06.380 [INFO][4730] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.85.72/32] ContainerID="6eb7b033e5976252ac2ff87fdf28920d3fc9ed52ee19dc5b0b55a7fb26b31c0d" Namespace="calico-apiserver" Pod="calico-apiserver-76998f65d4-lbk6r" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-calico--apiserver--76998f65d4--lbk6r-eth0" Jan 15 23:47:06.413839 containerd[1623]: 2026-01-15 23:47:06.380 [INFO][4730] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2a5a9fa016e ContainerID="6eb7b033e5976252ac2ff87fdf28920d3fc9ed52ee19dc5b0b55a7fb26b31c0d" Namespace="calico-apiserver" Pod="calico-apiserver-76998f65d4-lbk6r" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-calico--apiserver--76998f65d4--lbk6r-eth0" Jan 15 23:47:06.413839 containerd[1623]: 2026-01-15 23:47:06.384 [INFO][4730] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6eb7b033e5976252ac2ff87fdf28920d3fc9ed52ee19dc5b0b55a7fb26b31c0d" Namespace="calico-apiserver" Pod="calico-apiserver-76998f65d4-lbk6r" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-calico--apiserver--76998f65d4--lbk6r-eth0" Jan 15 23:47:06.413839 containerd[1623]: 2026-01-15 
23:47:06.386 [INFO][4730] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6eb7b033e5976252ac2ff87fdf28920d3fc9ed52ee19dc5b0b55a7fb26b31c0d" Namespace="calico-apiserver" Pod="calico-apiserver-76998f65d4-lbk6r" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-calico--apiserver--76998f65d4--lbk6r-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--b7ec270451-k8s-calico--apiserver--76998f65d4--lbk6r-eth0", GenerateName:"calico-apiserver-76998f65d4-", Namespace:"calico-apiserver", SelfLink:"", UID:"717a4ddd-81e7-49b2-b875-208b71de27d1", ResourceVersion:"821", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 23, 46, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"76998f65d4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-b7ec270451", ContainerID:"6eb7b033e5976252ac2ff87fdf28920d3fc9ed52ee19dc5b0b55a7fb26b31c0d", Pod:"calico-apiserver-76998f65d4-lbk6r", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.85.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali2a5a9fa016e", MAC:"f2:e5:7c:f4:01:cf", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 23:47:06.413839 containerd[1623]: 2026-01-15 23:47:06.403 [INFO][4730] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6eb7b033e5976252ac2ff87fdf28920d3fc9ed52ee19dc5b0b55a7fb26b31c0d" Namespace="calico-apiserver" Pod="calico-apiserver-76998f65d4-lbk6r" WorkloadEndpoint="ci--4459--2--2--n--b7ec270451-k8s-calico--apiserver--76998f65d4--lbk6r-eth0" Jan 15 23:47:06.453331 containerd[1623]: time="2026-01-15T23:47:06.453282440Z" level=info msg="connecting to shim 6eb7b033e5976252ac2ff87fdf28920d3fc9ed52ee19dc5b0b55a7fb26b31c0d" address="unix:///run/containerd/s/a7ef03781fb1db453ce3ea56cfc1d0cf04b950e2e820b9426e4e0239bbe19aec" namespace=k8s.io protocol=ttrpc version=3 Jan 15 23:47:06.483646 systemd[1]: Started cri-containerd-6eb7b033e5976252ac2ff87fdf28920d3fc9ed52ee19dc5b0b55a7fb26b31c0d.scope - libcontainer container 6eb7b033e5976252ac2ff87fdf28920d3fc9ed52ee19dc5b0b55a7fb26b31c0d. 
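Annotation: with both endpoints written to the datastore and the sandboxes starting, the PullImage requests issued earlier are still resolving their tags against the registry. The 404 entries that follow show ghcr.io has no manifest for the flatcar/calico/*:v3.30.4 references — which is exactly what containerd's "failed to resolve reference ... not found" errors and the kubelet's ErrImagePull/ImagePullBackOff cascade then report. A hypothetical way to reproduce the lookup directly is a HEAD request against the OCI distribution manifest endpoint; note the assumption that ghcr.io serves this anonymously — it may instead answer 401 until an anonymous bearer token is fetched.

```go
// Hypothetical registry probe (not part of the node's tooling): issue the same
// manifest lookup containerd performs when resolving a tag. Expect 404 per the
// log above, or 401 if ghcr.io demands a token first.
package main

import (
	"fmt"
	"log"
	"net/http"
)

func main() {
	url := "https://ghcr.io/v2/flatcar/calico/kube-controllers/manifests/v3.30.4"
	req, err := http.NewRequest(http.MethodHead, url, nil)
	if err != nil {
		log.Fatal(err)
	}
	// Accept headers a container runtime would send for an image manifest.
	req.Header.Set("Accept", "application/vnd.oci.image.index.v1+json")
	req.Header.Add("Accept", "application/vnd.docker.distribution.manifest.list.v2+json")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()
	fmt.Println(resp.Status)
}
```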
Jan 15 23:47:06.496006 containerd[1623]: time="2026-01-15T23:47:06.495688404Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-t2blj,Uid:c476e4d7-47c4-4b45-afc2-049d681292b9,Namespace:calico-system,Attempt:0,} returns sandbox id \"20797ba677e5061910e33a3f625ec1f656023668a50d5dfe5b0bb98d207a2567\"" Jan 15 23:47:06.532575 containerd[1623]: time="2026-01-15T23:47:06.532508022Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76998f65d4-lbk6r,Uid:717a4ddd-81e7-49b2-b875-208b71de27d1,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"6eb7b033e5976252ac2ff87fdf28920d3fc9ed52ee19dc5b0b55a7fb26b31c0d\"" Jan 15 23:47:06.662499 containerd[1623]: time="2026-01-15T23:47:06.662452890Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 23:47:06.663829 containerd[1623]: time="2026-01-15T23:47:06.663724776Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 15 23:47:06.663829 containerd[1623]: time="2026-01-15T23:47:06.663782856Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Jan 15 23:47:06.664016 kubelet[2882]: E0115 23:47:06.663959 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 15 23:47:06.664126 kubelet[2882]: E0115 23:47:06.664012 2882 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 15 23:47:06.664319 kubelet[2882]: E0115 23:47:06.664263 2882 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4kbff,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-69897db5bb-bgndc_calico-system(fc5a246d-dfc3-43e7-b2c7-409da1aecc92): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 15 23:47:06.664497 containerd[1623]: time="2026-01-15T23:47:06.664473660Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 15 23:47:06.665561 kubelet[2882]: E0115 23:47:06.665522 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-69897db5bb-bgndc" 
podUID="fc5a246d-dfc3-43e7-b2c7-409da1aecc92" Jan 15 23:47:06.988052 containerd[1623]: time="2026-01-15T23:47:06.987953583Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 23:47:06.990370 containerd[1623]: time="2026-01-15T23:47:06.990318434Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 15 23:47:06.990418 containerd[1623]: time="2026-01-15T23:47:06.990407554Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Jan 15 23:47:06.990685 kubelet[2882]: E0115 23:47:06.990617 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 15 23:47:06.990750 kubelet[2882]: E0115 23:47:06.990712 2882 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 15 23:47:06.991017 kubelet[2882]: E0115 23:47:06.990962 2882 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k9n5x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-t2blj_calico-system(c476e4d7-47c4-4b45-afc2-049d681292b9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 15 23:47:06.991220 containerd[1623]: time="2026-01-15T23:47:06.991190918Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 15 23:47:06.992338 kubelet[2882]: E0115 23:47:06.992301 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-t2blj" podUID="c476e4d7-47c4-4b45-afc2-049d681292b9" Jan 15 23:47:07.042096 kubelet[2882]: E0115 23:47:07.042044 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-69897db5bb-bgndc" podUID="fc5a246d-dfc3-43e7-b2c7-409da1aecc92" Jan 15 23:47:07.043135 kubelet[2882]: E0115 23:47:07.043050 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-t2blj" podUID="c476e4d7-47c4-4b45-afc2-049d681292b9" Jan 15 23:47:07.087122 kubelet[2882]: I0115 23:47:07.087054 2882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-zzfg8" podStartSLOduration=45.087035701 podStartE2EDuration="45.087035701s" podCreationTimestamp="2026-01-15 23:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-01-15 23:47:07.08473781 +0000 UTC m=+51.284733195" watchObservedRunningTime="2026-01-15 23:47:07.087035701 +0000 UTC m=+51.287031046" Jan 15 23:47:07.326878 containerd[1623]: time="2026-01-15T23:47:07.326704739Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 23:47:07.328418 containerd[1623]: time="2026-01-15T23:47:07.328337787Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 15 23:47:07.328418 containerd[1623]: time="2026-01-15T23:47:07.328392987Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Jan 15 23:47:07.328634 kubelet[2882]: E0115 23:47:07.328550 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 23:47:07.328634 kubelet[2882]: E0115 23:47:07.328595 2882 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 23:47:07.329016 kubelet[2882]: E0115 23:47:07.328718 2882 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8l7vh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-76998f65d4-lbk6r_calico-apiserver(717a4ddd-81e7-49b2-b875-208b71de27d1): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 15 23:47:07.330781 kubelet[2882]: E0115 23:47:07.330745 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76998f65d4-lbk6r" podUID="717a4ddd-81e7-49b2-b875-208b71de27d1" Jan 15 23:47:07.495664 systemd-networkd[1522]: calib448b98596f: Gained IPv6LL Jan 15 23:47:07.879546 systemd-networkd[1522]: cali4c1c7b99222: Gained IPv6LL Jan 15 23:47:07.880423 systemd-networkd[1522]: cali2a5a9fa016e: Gained IPv6LL Jan 15 23:47:08.045798 kubelet[2882]: E0115 23:47:08.045741 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-t2blj" podUID="c476e4d7-47c4-4b45-afc2-049d681292b9" Jan 15 23:47:08.046006 kubelet[2882]: E0115 23:47:08.045840 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76998f65d4-lbk6r" podUID="717a4ddd-81e7-49b2-b875-208b71de27d1" Jan 15 23:47:08.046469 kubelet[2882]: E0115 23:47:08.046414 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-69897db5bb-bgndc" podUID="fc5a246d-dfc3-43e7-b2c7-409da1aecc92" Jan 15 23:47:08.135581 systemd-networkd[1522]: cali90822a50ead: Gained IPv6LL Jan 15 23:47:12.882480 containerd[1623]: time="2026-01-15T23:47:12.882427499Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 15 23:47:13.225745 containerd[1623]: time="2026-01-15T23:47:13.225574796Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 23:47:13.235377 containerd[1623]: time="2026-01-15T23:47:13.233829516Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 15 23:47:13.235377 containerd[1623]: time="2026-01-15T23:47:13.233920197Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Jan 15 23:47:13.235563 kubelet[2882]: E0115 23:47:13.234108 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 15 23:47:13.235563 kubelet[2882]: E0115 23:47:13.234164 2882 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 15 23:47:13.235563 kubelet[2882]: E0115 23:47:13.234973 2882 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:3d31bd98c0c242218cb5129b8a984197,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4dslx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6d7d4c4d95-4t7sh_calico-system(7d889576-5f17-49b8-be14-3f0e7bea06cf): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 15 23:47:13.240060 containerd[1623]: time="2026-01-15T23:47:13.240016986Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 15 23:47:13.577210 containerd[1623]: time="2026-01-15T23:47:13.576998174Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 23:47:13.578815 containerd[1623]: time="2026-01-15T23:47:13.578710262Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 15 23:47:13.578815 containerd[1623]: time="2026-01-15T23:47:13.578788183Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Jan 15 23:47:13.579056 kubelet[2882]: E0115 23:47:13.579016 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 15 23:47:13.579116 kubelet[2882]: E0115 23:47:13.579068 2882 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not 
found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 15 23:47:13.579224 kubelet[2882]: E0115 23:47:13.579183 2882 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4dslx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6d7d4c4d95-4t7sh_calico-system(7d889576-5f17-49b8-be14-3f0e7bea06cf): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 15 23:47:13.580380 kubelet[2882]: E0115 23:47:13.580319 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6d7d4c4d95-4t7sh" podUID="7d889576-5f17-49b8-be14-3f0e7bea06cf" Jan 15 23:47:17.882483 containerd[1623]: time="2026-01-15T23:47:17.882057252Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 15 23:47:18.237637 containerd[1623]: time="2026-01-15T23:47:18.237585889Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 
15 23:47:18.238897 containerd[1623]: time="2026-01-15T23:47:18.238840335Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 15 23:47:18.238962 containerd[1623]: time="2026-01-15T23:47:18.238896776Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Jan 15 23:47:18.239165 kubelet[2882]: E0115 23:47:18.239128 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 23:47:18.239703 kubelet[2882]: E0115 23:47:18.239498 2882 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 23:47:18.239808 kubelet[2882]: E0115 23:47:18.239686 2882 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x4zpx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed 
in pod calico-apiserver-76998f65d4-hpm9k_calico-apiserver(ff12f75f-0f47-4e03-a069-ef3f612b51b0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 15 23:47:18.241057 kubelet[2882]: E0115 23:47:18.240963 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76998f65d4-hpm9k" podUID="ff12f75f-0f47-4e03-a069-ef3f612b51b0" Jan 15 23:47:18.882312 containerd[1623]: time="2026-01-15T23:47:18.882267204Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 15 23:47:19.217334 containerd[1623]: time="2026-01-15T23:47:19.217270462Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 23:47:19.218680 containerd[1623]: time="2026-01-15T23:47:19.218630029Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 15 23:47:19.218787 containerd[1623]: time="2026-01-15T23:47:19.218732389Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Jan 15 23:47:19.218924 kubelet[2882]: E0115 23:47:19.218884 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 15 23:47:19.218987 kubelet[2882]: E0115 23:47:19.218937 2882 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 15 23:47:19.219119 kubelet[2882]: E0115 23:47:19.219081 2882 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rqb9d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-zqjh8_calico-system(fee8d1af-3972-419e-8500-84b3b6b46b71): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 15 23:47:19.221414 containerd[1623]: time="2026-01-15T23:47:19.221383482Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 15 23:47:19.565538 containerd[1623]: time="2026-01-15T23:47:19.565354744Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 23:47:19.567899 containerd[1623]: time="2026-01-15T23:47:19.567842796Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 15 23:47:19.567997 containerd[1623]: time="2026-01-15T23:47:19.567935556Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Jan 15 23:47:19.568160 kubelet[2882]: E0115 23:47:19.568103 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 15 23:47:19.568473 kubelet[2882]: E0115 23:47:19.568161 2882 
kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 15 23:47:19.568473 kubelet[2882]: E0115 23:47:19.568274 2882 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rqb9d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-zqjh8_calico-system(fee8d1af-3972-419e-8500-84b3b6b46b71): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 15 23:47:19.569843 kubelet[2882]: E0115 23:47:19.569774 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: 
not found\"]" pod="calico-system/csi-node-driver-zqjh8" podUID="fee8d1af-3972-419e-8500-84b3b6b46b71" Jan 15 23:47:19.885091 containerd[1623]: time="2026-01-15T23:47:19.884939928Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 15 23:47:20.218156 containerd[1623]: time="2026-01-15T23:47:20.218097097Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 23:47:20.220463 containerd[1623]: time="2026-01-15T23:47:20.220378108Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 15 23:47:20.220463 containerd[1623]: time="2026-01-15T23:47:20.220445429Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Jan 15 23:47:20.221535 kubelet[2882]: E0115 23:47:20.221492 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 15 23:47:20.221605 kubelet[2882]: E0115 23:47:20.221550 2882 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 15 23:47:20.221727 kubelet[2882]: E0115 23:47:20.221683 2882 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4kbff,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-69897db5bb-bgndc_calico-system(fc5a246d-dfc3-43e7-b2c7-409da1aecc92): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 15 23:47:20.223839 kubelet[2882]: E0115 23:47:20.223787 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-69897db5bb-bgndc" podUID="fc5a246d-dfc3-43e7-b2c7-409da1aecc92" Jan 15 23:47:21.882597 containerd[1623]: time="2026-01-15T23:47:21.882420218Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 15 23:47:22.218872 containerd[1623]: time="2026-01-15T23:47:22.218813483Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 23:47:22.220140 containerd[1623]: time="2026-01-15T23:47:22.220086329Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 15 23:47:22.220226 containerd[1623]: time="2026-01-15T23:47:22.220180689Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Jan 15 23:47:22.220408 kubelet[2882]: E0115 23:47:22.220331 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 15 23:47:22.220408 kubelet[2882]: E0115 23:47:22.220388 2882 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 15 23:47:22.220717 kubelet[2882]: E0115 23:47:22.220530 2882 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k9n5x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-t2blj_calico-system(c476e4d7-47c4-4b45-afc2-049d681292b9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 15 23:47:22.222458 kubelet[2882]: E0115 23:47:22.222008 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to 
pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-t2blj" podUID="c476e4d7-47c4-4b45-afc2-049d681292b9" Jan 15 23:47:22.882740 containerd[1623]: time="2026-01-15T23:47:22.882702210Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 15 23:47:23.214453 containerd[1623]: time="2026-01-15T23:47:23.214186851Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 23:47:23.217484 containerd[1623]: time="2026-01-15T23:47:23.217411267Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 15 23:47:23.217610 containerd[1623]: time="2026-01-15T23:47:23.217485147Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Jan 15 23:47:23.217708 kubelet[2882]: E0115 23:47:23.217662 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 23:47:23.217756 kubelet[2882]: E0115 23:47:23.217717 2882 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 23:47:23.217880 kubelet[2882]: E0115 23:47:23.217836 2882 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8l7vh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-76998f65d4-lbk6r_calico-apiserver(717a4ddd-81e7-49b2-b875-208b71de27d1): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 15 23:47:23.219417 kubelet[2882]: E0115 23:47:23.219345 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76998f65d4-lbk6r" podUID="717a4ddd-81e7-49b2-b875-208b71de27d1" Jan 15 23:47:23.282764 systemd[1]: Started sshd@11-10.0.10.219:22-209.38.35.85:54930.service - OpenSSH per-connection server daemon (209.38.35.85:54930). Jan 15 23:47:23.356150 sshd[5175]: Connection closed by 209.38.35.85 port 54930 Jan 15 23:47:23.357397 systemd[1]: sshd@11-10.0.10.219:22-209.38.35.85:54930.service: Deactivated successfully. 
Jan 15 23:47:26.881891 kubelet[2882]: E0115 23:47:26.881791 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6d7d4c4d95-4t7sh" podUID="7d889576-5f17-49b8-be14-3f0e7bea06cf" Jan 15 23:47:32.882165 kubelet[2882]: E0115 23:47:32.882113 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-t2blj" podUID="c476e4d7-47c4-4b45-afc2-049d681292b9" Jan 15 23:47:32.884614 kubelet[2882]: E0115 23:47:32.884553 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76998f65d4-hpm9k" podUID="ff12f75f-0f47-4e03-a069-ef3f612b51b0" Jan 15 23:47:33.882557 kubelet[2882]: E0115 23:47:33.882510 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-69897db5bb-bgndc" podUID="fc5a246d-dfc3-43e7-b2c7-409da1aecc92" Jan 15 23:47:33.883520 kubelet[2882]: E0115 23:47:33.883480 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" 
with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-zqjh8" podUID="fee8d1af-3972-419e-8500-84b3b6b46b71" Jan 15 23:47:34.881248 kubelet[2882]: E0115 23:47:34.881175 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76998f65d4-lbk6r" podUID="717a4ddd-81e7-49b2-b875-208b71de27d1" Jan 15 23:47:38.882927 containerd[1623]: time="2026-01-15T23:47:38.882704946Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 15 23:47:39.229109 containerd[1623]: time="2026-01-15T23:47:39.228931498Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 23:47:39.234277 containerd[1623]: time="2026-01-15T23:47:39.234160564Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 15 23:47:39.234277 containerd[1623]: time="2026-01-15T23:47:39.234223164Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Jan 15 23:47:39.234459 kubelet[2882]: E0115 23:47:39.234388 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 15 23:47:39.234713 kubelet[2882]: E0115 23:47:39.234488 2882 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 15 23:47:39.234713 kubelet[2882]: E0115 23:47:39.234627 2882 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:3d31bd98c0c242218cb5129b8a984197,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4dslx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6d7d4c4d95-4t7sh_calico-system(7d889576-5f17-49b8-be14-3f0e7bea06cf): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 15 23:47:39.236873 containerd[1623]: time="2026-01-15T23:47:39.236834376Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 15 23:47:39.581074 containerd[1623]: time="2026-01-15T23:47:39.580936119Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 23:47:39.585780 containerd[1623]: time="2026-01-15T23:47:39.585708582Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 15 23:47:39.585930 containerd[1623]: time="2026-01-15T23:47:39.585810262Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Jan 15 23:47:39.586000 kubelet[2882]: E0115 23:47:39.585946 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 15 23:47:39.586073 kubelet[2882]: E0115 23:47:39.585998 2882 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not 
found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 15 23:47:39.586177 kubelet[2882]: E0115 23:47:39.586134 2882 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4dslx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6d7d4c4d95-4t7sh_calico-system(7d889576-5f17-49b8-be14-3f0e7bea06cf): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 15 23:47:39.587720 kubelet[2882]: E0115 23:47:39.587628 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6d7d4c4d95-4t7sh" podUID="7d889576-5f17-49b8-be14-3f0e7bea06cf" Jan 15 23:47:44.881974 containerd[1623]: time="2026-01-15T23:47:44.881939088Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 15 23:47:45.231301 containerd[1623]: time="2026-01-15T23:47:45.231226415Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 
23:47:45.232786 containerd[1623]: time="2026-01-15T23:47:45.232695022Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 15 23:47:45.232786 containerd[1623]: time="2026-01-15T23:47:45.232740743Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Jan 15 23:47:45.233145 kubelet[2882]: E0115 23:47:45.233086 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 15 23:47:45.233715 kubelet[2882]: E0115 23:47:45.233498 2882 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 15 23:47:45.233715 kubelet[2882]: E0115 23:47:45.233654 2882 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rqb9d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-zqjh8_calico-system(fee8d1af-3972-419e-8500-84b3b6b46b71): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": 
ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 15 23:47:45.236384 containerd[1623]: time="2026-01-15T23:47:45.236336080Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 15 23:47:45.762208 containerd[1623]: time="2026-01-15T23:47:45.761053735Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 23:47:45.764684 containerd[1623]: time="2026-01-15T23:47:45.764558472Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 15 23:47:45.764684 containerd[1623]: time="2026-01-15T23:47:45.764617992Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Jan 15 23:47:45.764824 kubelet[2882]: E0115 23:47:45.764776 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 15 23:47:45.764824 kubelet[2882]: E0115 23:47:45.764821 2882 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 15 23:47:45.765060 kubelet[2882]: E0115 23:47:45.765019 2882 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rqb9d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-zqjh8_calico-system(fee8d1af-3972-419e-8500-84b3b6b46b71): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 15 23:47:45.766667 kubelet[2882]: E0115 23:47:45.766620 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-zqjh8" podUID="fee8d1af-3972-419e-8500-84b3b6b46b71" Jan 15 23:47:45.883851 containerd[1623]: time="2026-01-15T23:47:45.883810128Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 15 23:47:46.411528 containerd[1623]: time="2026-01-15T23:47:46.411422557Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 23:47:46.412997 containerd[1623]: time="2026-01-15T23:47:46.412922164Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 15 23:47:46.412997 containerd[1623]: time="2026-01-15T23:47:46.412981084Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Jan 15 23:47:46.413241 kubelet[2882]: E0115 23:47:46.413187 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 23:47:46.413531 kubelet[2882]: E0115 23:47:46.413246 2882 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 23:47:46.413531 kubelet[2882]: E0115 23:47:46.413376 2882 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x4zpx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-76998f65d4-hpm9k_calico-apiserver(ff12f75f-0f47-4e03-a069-ef3f612b51b0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": 
failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 15 23:47:46.415366 kubelet[2882]: E0115 23:47:46.415321 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76998f65d4-hpm9k" podUID="ff12f75f-0f47-4e03-a069-ef3f612b51b0" Jan 15 23:47:46.882637 containerd[1623]: time="2026-01-15T23:47:46.882219511Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 15 23:47:47.239858 containerd[1623]: time="2026-01-15T23:47:47.239748238Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 23:47:47.241200 containerd[1623]: time="2026-01-15T23:47:47.241133805Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 15 23:47:47.241265 containerd[1623]: time="2026-01-15T23:47:47.241232766Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Jan 15 23:47:47.241523 kubelet[2882]: E0115 23:47:47.241422 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 15 23:47:47.241523 kubelet[2882]: E0115 23:47:47.241500 2882 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 15 23:47:47.242346 kubelet[2882]: E0115 23:47:47.241826 2882 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k9n5x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-t2blj_calico-system(c476e4d7-47c4-4b45-afc2-049d681292b9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 15 23:47:47.243406 kubelet[2882]: E0115 23:47:47.243367 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-t2blj" podUID="c476e4d7-47c4-4b45-afc2-049d681292b9" Jan 15 23:47:47.243556 containerd[1623]: 
time="2026-01-15T23:47:47.243529777Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 15 23:47:47.565956 containerd[1623]: time="2026-01-15T23:47:47.565837414Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 23:47:47.568821 containerd[1623]: time="2026-01-15T23:47:47.568770748Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 15 23:47:47.568988 containerd[1623]: time="2026-01-15T23:47:47.568862788Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Jan 15 23:47:47.569208 kubelet[2882]: E0115 23:47:47.569157 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 23:47:47.569988 kubelet[2882]: E0115 23:47:47.569206 2882 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 23:47:47.569988 kubelet[2882]: E0115 23:47:47.569415 2882 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8l7vh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-76998f65d4-lbk6r_calico-apiserver(717a4ddd-81e7-49b2-b875-208b71de27d1): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 15 23:47:47.570100 containerd[1623]: time="2026-01-15T23:47:47.569698432Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 15 23:47:47.570856 kubelet[2882]: E0115 23:47:47.570802 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76998f65d4-lbk6r" podUID="717a4ddd-81e7-49b2-b875-208b71de27d1" Jan 15 23:47:47.923799 containerd[1623]: time="2026-01-15T23:47:47.923584822Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 23:47:47.925609 containerd[1623]: time="2026-01-15T23:47:47.925569352Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 15 23:47:47.925727 containerd[1623]: time="2026-01-15T23:47:47.925650912Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Jan 15 23:47:47.925816 kubelet[2882]: E0115 23:47:47.925778 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 15 23:47:47.925859 kubelet[2882]: E0115 23:47:47.925827 2882 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 15 23:47:47.926013 kubelet[2882]: E0115 23:47:47.925965 2882 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4kbff,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-69897db5bb-bgndc_calico-system(fc5a246d-dfc3-43e7-b2c7-409da1aecc92): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 15 23:47:47.927421 kubelet[2882]: E0115 23:47:47.927380 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" 
pod="calico-system/calico-kube-controllers-69897db5bb-bgndc" podUID="fc5a246d-dfc3-43e7-b2c7-409da1aecc92" Jan 15 23:47:50.882310 kubelet[2882]: E0115 23:47:50.882226 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6d7d4c4d95-4t7sh" podUID="7d889576-5f17-49b8-be14-3f0e7bea06cf" Jan 15 23:47:59.882263 kubelet[2882]: E0115 23:47:59.882173 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-69897db5bb-bgndc" podUID="fc5a246d-dfc3-43e7-b2c7-409da1aecc92" Jan 15 23:47:59.883467 kubelet[2882]: E0115 23:47:59.883404 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76998f65d4-hpm9k" podUID="ff12f75f-0f47-4e03-a069-ef3f612b51b0" Jan 15 23:47:59.884155 kubelet[2882]: E0115 23:47:59.884113 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-zqjh8" podUID="fee8d1af-3972-419e-8500-84b3b6b46b71" Jan 15 
23:48:00.882497 kubelet[2882]: E0115 23:48:00.882385 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-t2blj" podUID="c476e4d7-47c4-4b45-afc2-049d681292b9" Jan 15 23:48:02.882296 kubelet[2882]: E0115 23:48:02.881923 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76998f65d4-lbk6r" podUID="717a4ddd-81e7-49b2-b875-208b71de27d1" Jan 15 23:48:04.881893 kubelet[2882]: E0115 23:48:04.881847 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6d7d4c4d95-4t7sh" podUID="7d889576-5f17-49b8-be14-3f0e7bea06cf" Jan 15 23:48:10.882460 kubelet[2882]: E0115 23:48:10.882401 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-zqjh8" podUID="fee8d1af-3972-419e-8500-84b3b6b46b71" Jan 15 23:48:10.882899 kubelet[2882]: E0115 23:48:10.882493 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with 
ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76998f65d4-hpm9k" podUID="ff12f75f-0f47-4e03-a069-ef3f612b51b0" Jan 15 23:48:13.883685 kubelet[2882]: E0115 23:48:13.883638 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-69897db5bb-bgndc" podUID="fc5a246d-dfc3-43e7-b2c7-409da1aecc92" Jan 15 23:48:14.881288 kubelet[2882]: E0115 23:48:14.881224 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-t2blj" podUID="c476e4d7-47c4-4b45-afc2-049d681292b9" Jan 15 23:48:15.883468 kubelet[2882]: E0115 23:48:15.883358 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76998f65d4-lbk6r" podUID="717a4ddd-81e7-49b2-b875-208b71de27d1" Jan 15 23:48:15.884597 kubelet[2882]: E0115 23:48:15.884534 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6d7d4c4d95-4t7sh" podUID="7d889576-5f17-49b8-be14-3f0e7bea06cf" Jan 15 23:48:20.217922 systemd[1]: Started sshd@12-10.0.10.219:22-209.38.35.85:49200.service - OpenSSH 
per-connection server daemon (209.38.35.85:49200). Jan 15 23:48:20.322887 sshd[5254]: Connection closed by authenticating user root 209.38.35.85 port 49200 [preauth] Jan 15 23:48:20.324676 systemd[1]: sshd@12-10.0.10.219:22-209.38.35.85:49200.service: Deactivated successfully. Jan 15 23:48:22.882117 kubelet[2882]: E0115 23:48:22.882054 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-zqjh8" podUID="fee8d1af-3972-419e-8500-84b3b6b46b71" Jan 15 23:48:24.881121 kubelet[2882]: E0115 23:48:24.881068 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76998f65d4-hpm9k" podUID="ff12f75f-0f47-4e03-a069-ef3f612b51b0" Jan 15 23:48:26.881214 kubelet[2882]: E0115 23:48:26.881159 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76998f65d4-lbk6r" podUID="717a4ddd-81e7-49b2-b875-208b71de27d1" Jan 15 23:48:27.881756 kubelet[2882]: E0115 23:48:27.881556 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-69897db5bb-bgndc" podUID="fc5a246d-dfc3-43e7-b2c7-409da1aecc92" Jan 15 23:48:28.881778 containerd[1623]: time="2026-01-15T23:48:28.881619450Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 15 23:48:29.221611 containerd[1623]: time="2026-01-15T23:48:29.221553892Z" level=info msg="fetch 
failed after status: 404 Not Found" host=ghcr.io Jan 15 23:48:29.225723 containerd[1623]: time="2026-01-15T23:48:29.225670152Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Jan 15 23:48:29.225863 containerd[1623]: time="2026-01-15T23:48:29.225748553Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 15 23:48:29.226714 kubelet[2882]: E0115 23:48:29.226568 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 15 23:48:29.226714 kubelet[2882]: E0115 23:48:29.226619 2882 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 15 23:48:29.227104 kubelet[2882]: E0115 23:48:29.226754 2882 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k9n5x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-t2blj_calico-system(c476e4d7-47c4-4b45-afc2-049d681292b9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 15 23:48:29.228832 kubelet[2882]: E0115 23:48:29.228785 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-t2blj" podUID="c476e4d7-47c4-4b45-afc2-049d681292b9" Jan 15 23:48:29.882533 containerd[1623]: time="2026-01-15T23:48:29.882483325Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 15 23:48:30.158198 systemd[1]: Started sshd@13-10.0.10.219:22-68.220.241.50:53968.service - OpenSSH per-connection server daemon (68.220.241.50:53968). 
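
Every pull in this stretch dies the same way: containerd asks ghcr.io for a v3.30.4 tag, the registry answers 404 Not Found, and kubelet surfaces that as ErrImagePull. The failure is easy to reproduce off the node with the standard OCI token-then-manifest flow against GHCR. This is a minimal sketch, not a verified tool: ghcr_tag_exists is a hypothetical helper, and the anonymous token endpoint is assumed to behave as it does for public GHCR repositories.

import json
import urllib.error
import urllib.request

def ghcr_tag_exists(repo: str, tag: str) -> bool:
    """Return True when ghcr.io can resolve repo:tag via anonymous pull."""
    # Anonymous bearer token for a public repository (docker/OCI token flow).
    token_url = f"https://ghcr.io/token?scope=repository:{repo}:pull"
    with urllib.request.urlopen(token_url) as resp:
        token = json.loads(resp.read())["token"]

    req = urllib.request.Request(
        f"https://ghcr.io/v2/{repo}/manifests/{tag}",
        method="HEAD",
        headers={
            "Authorization": f"Bearer {token}",
            # Accept both OCI and Docker manifest lists, as a puller would.
            "Accept": "application/vnd.oci.image.index.v1+json, "
                      "application/vnd.docker.distribution.manifest.list.v2+json",
        },
    )
    try:
        urllib.request.urlopen(req)
        return True
    except urllib.error.HTTPError as err:
        if err.code == 404:  # the same "not found" containerd logs above
            return False
        raise

print(ghcr_tag_exists("flatcar/calico/goldmane", "v3.30.4"))

A 404 from that HEAD request is exactly the "failed to resolve reference ... not found" error containerd reports above.
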
Jan 15 23:48:30.218643 containerd[1623]: time="2026-01-15T23:48:30.218587789Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 23:48:30.220036 containerd[1623]: time="2026-01-15T23:48:30.219970036Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 15 23:48:30.220143 containerd[1623]: time="2026-01-15T23:48:30.220071396Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Jan 15 23:48:30.220291 kubelet[2882]: E0115 23:48:30.220226 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 15 23:48:30.220291 kubelet[2882]: E0115 23:48:30.220286 2882 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 15 23:48:30.220440 kubelet[2882]: E0115 23:48:30.220403 2882 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:3d31bd98c0c242218cb5129b8a984197,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4dslx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6d7d4c4d95-4t7sh_calico-system(7d889576-5f17-49b8-be14-3f0e7bea06cf): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 15 23:48:30.222449 containerd[1623]: 
time="2026-01-15T23:48:30.222395167Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 15 23:48:30.553184 containerd[1623]: time="2026-01-15T23:48:30.553125725Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 23:48:30.554639 containerd[1623]: time="2026-01-15T23:48:30.554574612Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 15 23:48:30.554768 containerd[1623]: time="2026-01-15T23:48:30.554632253Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Jan 15 23:48:30.554856 kubelet[2882]: E0115 23:48:30.554808 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 15 23:48:30.555248 kubelet[2882]: E0115 23:48:30.554861 2882 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 15 23:48:30.555248 kubelet[2882]: E0115 23:48:30.554979 2882 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4dslx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6d7d4c4d95-4t7sh_calico-system(7d889576-5f17-49b8-be14-3f0e7bea06cf): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 15 23:48:30.556106 kubelet[2882]: E0115 23:48:30.556066 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6d7d4c4d95-4t7sh" podUID="7d889576-5f17-49b8-be14-3f0e7bea06cf" Jan 15 23:48:30.781555 sshd[5274]: Accepted publickey for core from 68.220.241.50 port 53968 ssh2: RSA SHA256:mlFwV2mNhWAX9miLlDaAVOccp1AYIG1i1y9cTR0vub4 Jan 15 23:48:30.783294 sshd-session[5274]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 23:48:30.787512 systemd-logind[1599]: New session 12 of user core. Jan 15 23:48:30.797672 systemd[1]: Started session-12.scope - Session 12 of User core. 
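
Note the two kinds of kubelet entries alternating above: ErrImagePull marks an actual pull attempt, while the ImagePullBackOff entries (23:47:50, 23:47:59, and so on) only mean kubelet declined to retry because it is still inside its backoff window. A minimal sketch of that schedule, assuming kubelet's default 10 s base doubling to a 300 s cap; those values are kubelet defaults taken as an assumption, not numbers read from this log.

def image_pull_backoff(base_s: int = 10, cap_s: int = 300):
    """Yield the delays kubelet waits between failed pull attempts."""
    delay = base_s
    while True:
        yield delay
        delay = min(delay * 2, cap_s)

gen = image_pull_backoff()
print([next(gen) for _ in range(7)])  # [10, 20, 40, 80, 160, 300, 300]

Each image backs off independently, which is why fresh PullImage attempts for different images keep appearing at staggered times below.
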
Jan 15 23:48:31.277131 sshd[5277]: Connection closed by 68.220.241.50 port 53968 Jan 15 23:48:31.277651 sshd-session[5274]: pam_unix(sshd:session): session closed for user core Jan 15 23:48:31.282013 systemd[1]: sshd@13-10.0.10.219:22-68.220.241.50:53968.service: Deactivated successfully. Jan 15 23:48:31.283868 systemd[1]: session-12.scope: Deactivated successfully. Jan 15 23:48:31.285280 systemd-logind[1599]: Session 12 logged out. Waiting for processes to exit. Jan 15 23:48:31.286399 systemd-logind[1599]: Removed session 12. Jan 15 23:48:35.883954 containerd[1623]: time="2026-01-15T23:48:35.883123314Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 15 23:48:36.218285 containerd[1623]: time="2026-01-15T23:48:36.218241893Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 23:48:36.219552 containerd[1623]: time="2026-01-15T23:48:36.219515420Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 15 23:48:36.219681 containerd[1623]: time="2026-01-15T23:48:36.219597700Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Jan 15 23:48:36.219821 kubelet[2882]: E0115 23:48:36.219751 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 15 23:48:36.219821 kubelet[2882]: E0115 23:48:36.219816 2882 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 15 23:48:36.220144 kubelet[2882]: E0115 23:48:36.219939 2882 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rqb9d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-zqjh8_calico-system(fee8d1af-3972-419e-8500-84b3b6b46b71): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 15 23:48:36.222725 containerd[1623]: time="2026-01-15T23:48:36.222668395Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 15 23:48:36.384283 systemd[1]: Started sshd@14-10.0.10.219:22-68.220.241.50:55264.service - OpenSSH per-connection server daemon (68.220.241.50:55264). 
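
With five images failing across four pods, the journal here is mostly repetition. A throwaway summarizer like the one below collapses the noise into a per-image count; the regex is tuned to the image references as they appear in this log, not to any fixed schema.

import re
from collections import Counter

# Matches references like ghcr.io/flatcar/calico/csi:v3.30.4, regardless
# of how many backslash-escaped quotes the journal wraps them in.
IMAGE_RE = re.compile(r"ghcr\.io/[\w./-]+:v[\d.]+")

def failing_images(lines):
    """Count image references appearing in 'Error syncing pod' entries."""
    counts = Counter()
    for line in lines:
        if "Error syncing pod" in line:
            counts.update(IMAGE_RE.findall(line))
    return counts

# with open("journal.txt") as f:
#     for image, n in failing_images(f).most_common():
#         print(f"{n:4d}  {image}")
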
Jan 15 23:48:36.549648 containerd[1623]: time="2026-01-15T23:48:36.549524534Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 23:48:36.551544 containerd[1623]: time="2026-01-15T23:48:36.551488623Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 15 23:48:36.551714 containerd[1623]: time="2026-01-15T23:48:36.551516863Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Jan 15 23:48:36.551803 kubelet[2882]: E0115 23:48:36.551763 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 15 23:48:36.551856 kubelet[2882]: E0115 23:48:36.551816 2882 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 15 23:48:36.551975 kubelet[2882]: E0115 23:48:36.551936 2882 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rqb9d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-zqjh8_calico-system(fee8d1af-3972-419e-8500-84b3b6b46b71): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 15 23:48:36.553710 kubelet[2882]: E0115 23:48:36.553649 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-zqjh8" podUID="fee8d1af-3972-419e-8500-84b3b6b46b71" Jan 15 23:48:37.008403 sshd[5292]: Accepted publickey for core from 68.220.241.50 port 55264 ssh2: RSA SHA256:mlFwV2mNhWAX9miLlDaAVOccp1AYIG1i1y9cTR0vub4 Jan 15 23:48:37.009703 sshd-session[5292]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 23:48:37.015491 systemd-logind[1599]: New session 13 of user core. Jan 15 23:48:37.024813 systemd[1]: Started session-13.scope - Session 13 of User core. 
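
Each "Unhandled Error" from kuberuntime_manager.go embeds the entire serialized Container spec, which buries the one useful fact: which container wants which image. A small extractor, assuming the dumps keep the &Container{Name:...,Image:...,} prefix seen above:

import re

# The serialized spec always starts "&Container{Name:...,Image:...," here.
CONTAINER_RE = re.compile(r"&Container\{Name:([^,]+),Image:([^,]+),")

def dumped_containers(text: str):
    """Return (name, image) pairs from kuberuntime_manager error dumps."""
    return CONTAINER_RE.findall(text)

# dumped_containers(log_text)
# -> [('csi-node-driver-registrar',
#      'ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4'), ...]
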
Jan 15 23:48:37.502318 sshd[5319]: Connection closed by 68.220.241.50 port 55264 Jan 15 23:48:37.502722 sshd-session[5292]: pam_unix(sshd:session): session closed for user core Jan 15 23:48:37.507868 systemd[1]: sshd@14-10.0.10.219:22-68.220.241.50:55264.service: Deactivated successfully. Jan 15 23:48:37.510417 systemd[1]: session-13.scope: Deactivated successfully. Jan 15 23:48:37.512665 systemd-logind[1599]: Session 13 logged out. Waiting for processes to exit. Jan 15 23:48:37.515724 systemd-logind[1599]: Removed session 13. Jan 15 23:48:37.883934 containerd[1623]: time="2026-01-15T23:48:37.883819180Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 15 23:48:38.215704 containerd[1623]: time="2026-01-15T23:48:38.215639823Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 23:48:38.217496 containerd[1623]: time="2026-01-15T23:48:38.217428511Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 15 23:48:38.217582 containerd[1623]: time="2026-01-15T23:48:38.217502152Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Jan 15 23:48:38.217739 kubelet[2882]: E0115 23:48:38.217679 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 23:48:38.217739 kubelet[2882]: E0115 23:48:38.217732 2882 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 23:48:38.218053 kubelet[2882]: E0115 23:48:38.217926 2882 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8l7vh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-76998f65d4-lbk6r_calico-apiserver(717a4ddd-81e7-49b2-b875-208b71de27d1): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 15 23:48:38.218553 containerd[1623]: time="2026-01-15T23:48:38.218525437Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 15 23:48:38.220533 kubelet[2882]: E0115 23:48:38.219746 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76998f65d4-lbk6r" podUID="717a4ddd-81e7-49b2-b875-208b71de27d1" Jan 15 23:48:38.553226 containerd[1623]: time="2026-01-15T23:48:38.553083493Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 23:48:38.554933 containerd[1623]: time="2026-01-15T23:48:38.554881182Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 15 23:48:38.555050 containerd[1623]: time="2026-01-15T23:48:38.554949662Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Jan 15 23:48:38.555124 kubelet[2882]: E0115 23:48:38.555085 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 23:48:38.555172 kubelet[2882]: E0115 23:48:38.555133 2882 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 23:48:38.555307 kubelet[2882]: E0115 
23:48:38.555261 2882 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x4zpx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-76998f65d4-hpm9k_calico-apiserver(ff12f75f-0f47-4e03-a069-ef3f612b51b0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 15 23:48:38.556447 kubelet[2882]: E0115 23:48:38.556399 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76998f65d4-hpm9k" podUID="ff12f75f-0f47-4e03-a069-ef3f612b51b0" Jan 15 23:48:40.882575 kubelet[2882]: E0115 23:48:40.882501 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" 
pod="calico-system/goldmane-666569f655-t2blj" podUID="c476e4d7-47c4-4b45-afc2-049d681292b9" Jan 15 23:48:40.883204 kubelet[2882]: E0115 23:48:40.883036 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6d7d4c4d95-4t7sh" podUID="7d889576-5f17-49b8-be14-3f0e7bea06cf" Jan 15 23:48:42.614237 systemd[1]: Started sshd@15-10.0.10.219:22-68.220.241.50:40244.service - OpenSSH per-connection server daemon (68.220.241.50:40244). Jan 15 23:48:42.882702 containerd[1623]: time="2026-01-15T23:48:42.882588849Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 15 23:48:43.212358 containerd[1623]: time="2026-01-15T23:48:43.212314122Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 23:48:43.214622 containerd[1623]: time="2026-01-15T23:48:43.214564013Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 15 23:48:43.214733 containerd[1623]: time="2026-01-15T23:48:43.214668573Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Jan 15 23:48:43.215002 kubelet[2882]: E0115 23:48:43.214911 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 15 23:48:43.215655 kubelet[2882]: E0115 23:48:43.214992 2882 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 15 23:48:43.215655 kubelet[2882]: E0115 23:48:43.215493 2882 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4kbff,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-69897db5bb-bgndc_calico-system(fc5a246d-dfc3-43e7-b2c7-409da1aecc92): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 15 23:48:43.217047 kubelet[2882]: E0115 23:48:43.216769 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-69897db5bb-bgndc" podUID="fc5a246d-dfc3-43e7-b2c7-409da1aecc92" Jan 15 23:48:43.239949 sshd[5358]: Accepted publickey for core from 68.220.241.50 port 40244 ssh2: RSA 
SHA256:mlFwV2mNhWAX9miLlDaAVOccp1AYIG1i1y9cTR0vub4 Jan 15 23:48:43.241877 sshd-session[5358]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 23:48:43.248505 systemd-logind[1599]: New session 14 of user core. Jan 15 23:48:43.252874 systemd[1]: Started session-14.scope - Session 14 of User core. Jan 15 23:48:43.732892 sshd[5361]: Connection closed by 68.220.241.50 port 40244 Jan 15 23:48:43.733639 sshd-session[5358]: pam_unix(sshd:session): session closed for user core Jan 15 23:48:43.737475 systemd[1]: sshd@15-10.0.10.219:22-68.220.241.50:40244.service: Deactivated successfully. Jan 15 23:48:43.740858 systemd[1]: session-14.scope: Deactivated successfully. Jan 15 23:48:43.741748 systemd-logind[1599]: Session 14 logged out. Waiting for processes to exit. Jan 15 23:48:43.742762 systemd-logind[1599]: Removed session 14. Jan 15 23:48:43.841463 systemd[1]: Started sshd@16-10.0.10.219:22-68.220.241.50:40258.service - OpenSSH per-connection server daemon (68.220.241.50:40258). Jan 15 23:48:44.464621 sshd[5376]: Accepted publickey for core from 68.220.241.50 port 40258 ssh2: RSA SHA256:mlFwV2mNhWAX9miLlDaAVOccp1AYIG1i1y9cTR0vub4 Jan 15 23:48:44.467413 sshd-session[5376]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 23:48:44.473758 systemd-logind[1599]: New session 15 of user core. Jan 15 23:48:44.482651 systemd[1]: Started session-15.scope - Session 15 of User core. Jan 15 23:48:44.998975 sshd[5379]: Connection closed by 68.220.241.50 port 40258 Jan 15 23:48:44.999725 sshd-session[5376]: pam_unix(sshd:session): session closed for user core Jan 15 23:48:45.004237 systemd[1]: sshd@16-10.0.10.219:22-68.220.241.50:40258.service: Deactivated successfully. Jan 15 23:48:45.008446 systemd[1]: session-15.scope: Deactivated successfully. Jan 15 23:48:45.010034 systemd-logind[1599]: Session 15 logged out. Waiting for processes to exit. Jan 15 23:48:45.012273 systemd-logind[1599]: Removed session 15. Jan 15 23:48:45.111474 systemd[1]: Started sshd@17-10.0.10.219:22-68.220.241.50:40272.service - OpenSSH per-connection server daemon (68.220.241.50:40272). Jan 15 23:48:45.728309 sshd[5390]: Accepted publickey for core from 68.220.241.50 port 40272 ssh2: RSA SHA256:mlFwV2mNhWAX9miLlDaAVOccp1AYIG1i1y9cTR0vub4 Jan 15 23:48:45.729642 sshd-session[5390]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 23:48:45.733352 systemd-logind[1599]: New session 16 of user core. Jan 15 23:48:45.744689 systemd[1]: Started session-16.scope - Session 16 of User core. Jan 15 23:48:46.217813 sshd[5393]: Connection closed by 68.220.241.50 port 40272 Jan 15 23:48:46.218258 sshd-session[5390]: pam_unix(sshd:session): session closed for user core Jan 15 23:48:46.223150 systemd[1]: sshd@17-10.0.10.219:22-68.220.241.50:40272.service: Deactivated successfully. Jan 15 23:48:46.226198 systemd[1]: session-16.scope: Deactivated successfully. Jan 15 23:48:46.227289 systemd-logind[1599]: Session 16 logged out. Waiting for processes to exit. Jan 15 23:48:46.228378 systemd-logind[1599]: Removed session 16. 
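
Interleaved with the pull failures, sessions 12 through 16 for user core each open and close within about a second. Pairing systemd-logind's "New session" and "Removed session" lines gives per-session durations; this sketch assumes journal lines that begin with this log's Mon DD HH:MM:SS.ffffff timestamp.

from datetime import datetime

def stamp(line: str) -> datetime:
    # Lines start e.g. "Jan 15 23:48:43.248505"; the year is absent, so
    # datetime defaults it to 1900 -- harmless when only durations matter.
    return datetime.strptime(line[:22], "%b %d %H:%M:%S.%f")

def session_durations(lines):
    """Yield (session_id, duration) for matched open/close line pairs."""
    opened = {}
    for line in lines:
        if "New session" in line:
            sid = line.split("New session", 1)[1].split()[0]
            opened[sid] = stamp(line)
        elif "Removed session" in line:
            sid = line.split("Removed session", 1)[1].split()[0].rstrip(".")
            if sid in opened:
                yield sid, stamp(line) - opened.pop(sid)
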
Jan 15 23:48:47.883118 kubelet[2882]: E0115 23:48:47.882730 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-zqjh8" podUID="fee8d1af-3972-419e-8500-84b3b6b46b71" Jan 15 23:48:51.329040 systemd[1]: Started sshd@18-10.0.10.219:22-68.220.241.50:40288.service - OpenSSH per-connection server daemon (68.220.241.50:40288). Jan 15 23:48:51.882862 kubelet[2882]: E0115 23:48:51.882789 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76998f65d4-lbk6r" podUID="717a4ddd-81e7-49b2-b875-208b71de27d1" Jan 15 23:48:51.954830 sshd[5408]: Accepted publickey for core from 68.220.241.50 port 40288 ssh2: RSA SHA256:mlFwV2mNhWAX9miLlDaAVOccp1AYIG1i1y9cTR0vub4 Jan 15 23:48:51.956122 sshd-session[5408]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 23:48:51.959847 systemd-logind[1599]: New session 17 of user core. Jan 15 23:48:51.976652 systemd[1]: Started session-17.scope - Session 17 of User core. Jan 15 23:48:52.446495 sshd[5411]: Connection closed by 68.220.241.50 port 40288 Jan 15 23:48:52.447423 sshd-session[5408]: pam_unix(sshd:session): session closed for user core Jan 15 23:48:52.451500 systemd[1]: sshd@18-10.0.10.219:22-68.220.241.50:40288.service: Deactivated successfully. Jan 15 23:48:52.453985 systemd[1]: session-17.scope: Deactivated successfully. Jan 15 23:48:52.455128 systemd-logind[1599]: Session 17 logged out. Waiting for processes to exit. Jan 15 23:48:52.456375 systemd-logind[1599]: Removed session 17. Jan 15 23:48:52.552852 systemd[1]: Started sshd@19-10.0.10.219:22-68.220.241.50:48416.service - OpenSSH per-connection server daemon (68.220.241.50:48416). 
Jan 15 23:48:52.882841 kubelet[2882]: E0115 23:48:52.882794 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76998f65d4-hpm9k" podUID="ff12f75f-0f47-4e03-a069-ef3f612b51b0"
Jan 15 23:48:53.168138 sshd[5425]: Accepted publickey for core from 68.220.241.50 port 48416 ssh2: RSA SHA256:mlFwV2mNhWAX9miLlDaAVOccp1AYIG1i1y9cTR0vub4
Jan 15 23:48:53.169343 sshd-session[5425]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 15 23:48:53.174872 systemd-logind[1599]: New session 18 of user core.
Jan 15 23:48:53.182613 systemd[1]: Started session-18.scope - Session 18 of User core.
Jan 15 23:48:53.711688 sshd[5430]: Connection closed by 68.220.241.50 port 48416
Jan 15 23:48:53.712286 sshd-session[5425]: pam_unix(sshd:session): session closed for user core
Jan 15 23:48:53.715834 systemd-logind[1599]: Session 18 logged out. Waiting for processes to exit.
Jan 15 23:48:53.716212 systemd[1]: sshd@19-10.0.10.219:22-68.220.241.50:48416.service: Deactivated successfully.
Jan 15 23:48:53.719547 systemd[1]: session-18.scope: Deactivated successfully.
Jan 15 23:48:53.721909 systemd-logind[1599]: Removed session 18.
Jan 15 23:48:53.819793 systemd[1]: Started sshd@20-10.0.10.219:22-68.220.241.50:48426.service - OpenSSH per-connection server daemon (68.220.241.50:48426).
Jan 15 23:48:53.882387 kubelet[2882]: E0115 23:48:53.882330 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-t2blj" podUID="c476e4d7-47c4-4b45-afc2-049d681292b9"
Jan 15 23:48:53.883205 kubelet[2882]: E0115 23:48:53.883071 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6d7d4c4d95-4t7sh" podUID="7d889576-5f17-49b8-be14-3f0e7bea06cf"
Jan 15 23:48:54.442976 sshd[5441]: Accepted publickey for core from 68.220.241.50 port 48426 ssh2: RSA SHA256:mlFwV2mNhWAX9miLlDaAVOccp1AYIG1i1y9cTR0vub4
Jan 15 23:48:54.444981 sshd-session[5441]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 15 23:48:54.448880 systemd-logind[1599]: New session 19 of user core.
Jan 15 23:48:54.458661 systemd[1]: Started session-19.scope - Session 19 of User core.
Jan 15 23:48:54.881336 kubelet[2882]: E0115 23:48:54.881228 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-69897db5bb-bgndc" podUID="fc5a246d-dfc3-43e7-b2c7-409da1aecc92"
Jan 15 23:48:55.547511 sshd[5444]: Connection closed by 68.220.241.50 port 48426
Jan 15 23:48:55.548058 sshd-session[5441]: pam_unix(sshd:session): session closed for user core
Jan 15 23:48:55.551727 systemd[1]: sshd@20-10.0.10.219:22-68.220.241.50:48426.service: Deactivated successfully.
Jan 15 23:48:55.553471 systemd[1]: session-19.scope: Deactivated successfully.
Jan 15 23:48:55.554125 systemd-logind[1599]: Session 19 logged out. Waiting for processes to exit.
Jan 15 23:48:55.555372 systemd-logind[1599]: Removed session 19.
Jan 15 23:48:55.656637 systemd[1]: Started sshd@21-10.0.10.219:22-68.220.241.50:48430.service - OpenSSH per-connection server daemon (68.220.241.50:48430).
Jan 15 23:48:56.272658 sshd[5464]: Accepted publickey for core from 68.220.241.50 port 48430 ssh2: RSA SHA256:mlFwV2mNhWAX9miLlDaAVOccp1AYIG1i1y9cTR0vub4
Jan 15 23:48:56.274144 sshd-session[5464]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 15 23:48:56.279961 systemd-logind[1599]: New session 20 of user core.
Jan 15 23:48:56.285980 systemd[1]: Started session-20.scope - Session 20 of User core.
Jan 15 23:48:56.915934 sshd[5467]: Connection closed by 68.220.241.50 port 48430
Jan 15 23:48:56.916471 sshd-session[5464]: pam_unix(sshd:session): session closed for user core
Jan 15 23:48:56.920049 systemd[1]: sshd@21-10.0.10.219:22-68.220.241.50:48430.service: Deactivated successfully.
Jan 15 23:48:56.922967 systemd[1]: session-20.scope: Deactivated successfully.
Jan 15 23:48:56.923734 systemd-logind[1599]: Session 20 logged out. Waiting for processes to exit.
Jan 15 23:48:56.926046 systemd-logind[1599]: Removed session 20.
Jan 15 23:48:57.023037 systemd[1]: Started sshd@22-10.0.10.219:22-68.220.241.50:48442.service - OpenSSH per-connection server daemon (68.220.241.50:48442).
Jan 15 23:48:57.645250 sshd[5478]: Accepted publickey for core from 68.220.241.50 port 48442 ssh2: RSA SHA256:mlFwV2mNhWAX9miLlDaAVOccp1AYIG1i1y9cTR0vub4
Jan 15 23:48:57.647386 sshd-session[5478]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 15 23:48:57.652087 systemd-logind[1599]: New session 21 of user core.
Jan 15 23:48:57.665642 systemd[1]: Started session-21.scope - Session 21 of User core.
Jan 15 23:48:58.149205 sshd[5481]: Connection closed by 68.220.241.50 port 48442
Jan 15 23:48:58.149731 sshd-session[5478]: pam_unix(sshd:session): session closed for user core
Jan 15 23:48:58.153254 systemd[1]: sshd@22-10.0.10.219:22-68.220.241.50:48442.service: Deactivated successfully.
Jan 15 23:48:58.155190 systemd[1]: session-21.scope: Deactivated successfully.
Jan 15 23:48:58.156092 systemd-logind[1599]: Session 21 logged out. Waiting for processes to exit.
Jan 15 23:48:58.157635 systemd-logind[1599]: Removed session 21.
Jan 15 23:48:58.883298 kubelet[2882]: E0115 23:48:58.883077 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-zqjh8" podUID="fee8d1af-3972-419e-8500-84b3b6b46b71"
Jan 15 23:49:00.292010 systemd[1]: Started sshd@23-10.0.10.219:22-209.38.35.85:48874.service - OpenSSH per-connection server daemon (209.38.35.85:48874).
Jan 15 23:49:00.387318 sshd[5497]: Connection closed by authenticating user root 209.38.35.85 port 48874 [preauth]
Jan 15 23:49:00.389771 systemd[1]: sshd@23-10.0.10.219:22-209.38.35.85:48874.service: Deactivated successfully.
Jan 15 23:49:03.260671 systemd[1]: Started sshd@24-10.0.10.219:22-68.220.241.50:51178.service - OpenSSH per-connection server daemon (68.220.241.50:51178).
Jan 15 23:49:03.882218 kubelet[2882]: E0115 23:49:03.881610 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76998f65d4-lbk6r" podUID="717a4ddd-81e7-49b2-b875-208b71de27d1"
Jan 15 23:49:03.897520 sshd[5503]: Accepted publickey for core from 68.220.241.50 port 51178 ssh2: RSA SHA256:mlFwV2mNhWAX9miLlDaAVOccp1AYIG1i1y9cTR0vub4
Jan 15 23:49:03.899234 sshd-session[5503]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 15 23:49:03.903217 systemd-logind[1599]: New session 22 of user core.
Jan 15 23:49:03.911939 systemd[1]: Started session-22.scope - Session 22 of User core.
Jan 15 23:49:04.413972 sshd[5506]: Connection closed by 68.220.241.50 port 51178
Jan 15 23:49:04.414616 sshd-session[5503]: pam_unix(sshd:session): session closed for user core
Jan 15 23:49:04.419038 systemd[1]: sshd@24-10.0.10.219:22-68.220.241.50:51178.service: Deactivated successfully.
Jan 15 23:49:04.421683 systemd[1]: session-22.scope: Deactivated successfully.
Jan 15 23:49:04.424264 systemd-logind[1599]: Session 22 logged out. Waiting for processes to exit.
Jan 15 23:49:04.425653 systemd-logind[1599]: Removed session 22.
Jan 15 23:49:06.882872 kubelet[2882]: E0115 23:49:06.882826 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-t2blj" podUID="c476e4d7-47c4-4b45-afc2-049d681292b9"
Jan 15 23:49:06.883519 kubelet[2882]: E0115 23:49:06.883421 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6d7d4c4d95-4t7sh" podUID="7d889576-5f17-49b8-be14-3f0e7bea06cf"
Jan 15 23:49:07.882907 kubelet[2882]: E0115 23:49:07.882608 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-69897db5bb-bgndc" podUID="fc5a246d-dfc3-43e7-b2c7-409da1aecc92"
Jan 15 23:49:07.885038 kubelet[2882]: E0115 23:49:07.884867 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76998f65d4-hpm9k" podUID="ff12f75f-0f47-4e03-a069-ef3f612b51b0"
Jan 15 23:49:09.528457 systemd[1]: Started sshd@25-10.0.10.219:22-68.220.241.50:51192.service - OpenSSH per-connection server daemon (68.220.241.50:51192).
Jan 15 23:49:10.145011 sshd[5545]: Accepted publickey for core from 68.220.241.50 port 51192 ssh2: RSA SHA256:mlFwV2mNhWAX9miLlDaAVOccp1AYIG1i1y9cTR0vub4
Jan 15 23:49:10.146426 sshd-session[5545]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 15 23:49:10.151510 systemd-logind[1599]: New session 23 of user core.
Jan 15 23:49:10.160620 systemd[1]: Started session-23.scope - Session 23 of User core.
Jan 15 23:49:10.641453 sshd[5548]: Connection closed by 68.220.241.50 port 51192
Jan 15 23:49:10.641549 sshd-session[5545]: pam_unix(sshd:session): session closed for user core
Jan 15 23:49:10.646042 systemd[1]: sshd@25-10.0.10.219:22-68.220.241.50:51192.service: Deactivated successfully.
Jan 15 23:49:10.647906 systemd[1]: session-23.scope: Deactivated successfully.
Jan 15 23:49:10.648660 systemd-logind[1599]: Session 23 logged out. Waiting for processes to exit.
Jan 15 23:49:10.650262 systemd-logind[1599]: Removed session 23.
Jan 15 23:49:10.882466 kubelet[2882]: E0115 23:49:10.881787 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-zqjh8" podUID="fee8d1af-3972-419e-8500-84b3b6b46b71"
Jan 15 23:49:15.752332 systemd[1]: Started sshd@26-10.0.10.219:22-68.220.241.50:50798.service - OpenSSH per-connection server daemon (68.220.241.50:50798).
Jan 15 23:49:16.361055 sshd[5563]: Accepted publickey for core from 68.220.241.50 port 50798 ssh2: RSA SHA256:mlFwV2mNhWAX9miLlDaAVOccp1AYIG1i1y9cTR0vub4
Jan 15 23:49:16.362557 sshd-session[5563]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 15 23:49:16.366498 systemd-logind[1599]: New session 24 of user core.
Jan 15 23:49:16.372675 systemd[1]: Started session-24.scope - Session 24 of User core.
Jan 15 23:49:16.857048 sshd[5568]: Connection closed by 68.220.241.50 port 50798
Jan 15 23:49:16.857651 sshd-session[5563]: pam_unix(sshd:session): session closed for user core
Jan 15 23:49:16.861316 systemd[1]: sshd@26-10.0.10.219:22-68.220.241.50:50798.service: Deactivated successfully.
Jan 15 23:49:16.865619 systemd[1]: session-24.scope: Deactivated successfully.
Jan 15 23:49:16.866634 systemd-logind[1599]: Session 24 logged out. Waiting for processes to exit.
Jan 15 23:49:16.868999 systemd-logind[1599]: Removed session 24.
Jan 15 23:49:16.882148 kubelet[2882]: E0115 23:49:16.882082 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76998f65d4-lbk6r" podUID="717a4ddd-81e7-49b2-b875-208b71de27d1"
Jan 15 23:49:17.882054 kubelet[2882]: E0115 23:49:17.881993 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-t2blj" podUID="c476e4d7-47c4-4b45-afc2-049d681292b9"
Jan 15 23:49:18.881145 kubelet[2882]: E0115 23:49:18.881103 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76998f65d4-hpm9k" podUID="ff12f75f-0f47-4e03-a069-ef3f612b51b0"
Jan 15 23:49:20.882012 kubelet[2882]: E0115 23:49:20.881952 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6d7d4c4d95-4t7sh" podUID="7d889576-5f17-49b8-be14-3f0e7bea06cf"
Jan 15 23:49:21.968634 systemd[1]: Started sshd@27-10.0.10.219:22-68.220.241.50:50812.service - OpenSSH per-connection server daemon (68.220.241.50:50812).
Jan 15 23:49:22.597498 sshd[5581]: Accepted publickey for core from 68.220.241.50 port 50812 ssh2: RSA SHA256:mlFwV2mNhWAX9miLlDaAVOccp1AYIG1i1y9cTR0vub4
Jan 15 23:49:22.599153 sshd-session[5581]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 15 23:49:22.603234 systemd-logind[1599]: New session 25 of user core.
Jan 15 23:49:22.615621 systemd[1]: Started session-25.scope - Session 25 of User core.
Jan 15 23:49:22.881867 kubelet[2882]: E0115 23:49:22.881748 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-69897db5bb-bgndc" podUID="fc5a246d-dfc3-43e7-b2c7-409da1aecc92"
Jan 15 23:49:23.089084 sshd[5584]: Connection closed by 68.220.241.50 port 50812
Jan 15 23:49:23.091645 sshd-session[5581]: pam_unix(sshd:session): session closed for user core
Jan 15 23:49:23.094675 systemd[1]: sshd@27-10.0.10.219:22-68.220.241.50:50812.service: Deactivated successfully.
Jan 15 23:49:23.096520 systemd[1]: session-25.scope: Deactivated successfully.
Jan 15 23:49:23.098632 systemd-logind[1599]: Session 25 logged out. Waiting for processes to exit.
Jan 15 23:49:23.099968 systemd-logind[1599]: Removed session 25.
Jan 15 23:49:23.884698 kubelet[2882]: E0115 23:49:23.884655 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-zqjh8" podUID="fee8d1af-3972-419e-8500-84b3b6b46b71"
Jan 15 23:49:28.881750 kubelet[2882]: E0115 23:49:28.881661 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76998f65d4-lbk6r" podUID="717a4ddd-81e7-49b2-b875-208b71de27d1"
Jan 15 23:49:29.882012 kubelet[2882]: E0115 23:49:29.881922 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76998f65d4-hpm9k" podUID="ff12f75f-0f47-4e03-a069-ef3f612b51b0"
Jan 15 23:49:29.883628 kubelet[2882]: E0115 23:49:29.883583 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-t2blj" podUID="c476e4d7-47c4-4b45-afc2-049d681292b9"
Jan 15 23:49:33.885076 kubelet[2882]: E0115 23:49:33.884882 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-69897db5bb-bgndc" podUID="fc5a246d-dfc3-43e7-b2c7-409da1aecc92"
Jan 15 23:49:33.885460 kubelet[2882]: E0115 23:49:33.885325 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6d7d4c4d95-4t7sh" podUID="7d889576-5f17-49b8-be14-3f0e7bea06cf"
Jan 15 23:49:38.882029 kubelet[2882]: E0115 23:49:38.881947 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-zqjh8" podUID="fee8d1af-3972-419e-8500-84b3b6b46b71"
Jan 15 23:49:40.881350 kubelet[2882]: E0115 23:49:40.881306 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76998f65d4-hpm9k" podUID="ff12f75f-0f47-4e03-a069-ef3f612b51b0"
Jan 15 23:49:41.609904 systemd[1]: Started sshd@28-10.0.10.219:22-209.38.35.85:47888.service - OpenSSH per-connection server daemon (209.38.35.85:47888).
Jan 15 23:49:41.888583 sshd[5627]: Connection closed by authenticating user root 209.38.35.85 port 47888 [preauth]
Jan 15 23:49:41.891120 systemd[1]: sshd@28-10.0.10.219:22-209.38.35.85:47888.service: Deactivated successfully.
Jan 15 23:49:43.882010 kubelet[2882]: E0115 23:49:43.881961 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76998f65d4-lbk6r" podUID="717a4ddd-81e7-49b2-b875-208b71de27d1"
Jan 15 23:49:44.882315 kubelet[2882]: E0115 23:49:44.882256 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-t2blj" podUID="c476e4d7-47c4-4b45-afc2-049d681292b9"
Jan 15 23:49:46.881662 kubelet[2882]: E0115 23:49:46.881571 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6d7d4c4d95-4t7sh" podUID="7d889576-5f17-49b8-be14-3f0e7bea06cf"
Jan 15 23:49:48.842365 kubelet[2882]: E0115 23:49:48.842313 2882 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.10.219:55860->10.0.10.151:2379: read: connection timed out"
Jan 15 23:49:48.845708 systemd[1]: cri-containerd-0f9d56bac994efa6b74213874400b108aefd8683233dad233862d9b06658faca.scope: Deactivated successfully.
Jan 15 23:49:48.846019 systemd[1]: cri-containerd-0f9d56bac994efa6b74213874400b108aefd8683233dad233862d9b06658faca.scope: Consumed 3.955s CPU time, 23.5M memory peak.
Jan 15 23:49:48.847210 containerd[1623]: time="2026-01-15T23:49:48.847110724Z" level=info msg="received container exit event container_id:\"0f9d56bac994efa6b74213874400b108aefd8683233dad233862d9b06658faca\" id:\"0f9d56bac994efa6b74213874400b108aefd8683233dad233862d9b06658faca\" pid:2751 exit_status:1 exited_at:{seconds:1768520988 nanos:846810842}"
Jan 15 23:49:48.867990 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0f9d56bac994efa6b74213874400b108aefd8683233dad233862d9b06658faca-rootfs.mount: Deactivated successfully.
Jan 15 23:49:48.881974 kubelet[2882]: E0115 23:49:48.881927 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-69897db5bb-bgndc" podUID="fc5a246d-dfc3-43e7-b2c7-409da1aecc92"
Jan 15 23:49:49.131005 systemd[1]: cri-containerd-8203c8f4ef4ef3992d2ff08c8245827de921d85365ba7aa78ec2a6091f205ad2.scope: Deactivated successfully.
Jan 15 23:49:49.131460 systemd[1]: cri-containerd-8203c8f4ef4ef3992d2ff08c8245827de921d85365ba7aa78ec2a6091f205ad2.scope: Consumed 4.689s CPU time, 63.6M memory peak.
Jan 15 23:49:49.132452 containerd[1623]: time="2026-01-15T23:49:49.132335421Z" level=info msg="received container exit event container_id:\"8203c8f4ef4ef3992d2ff08c8245827de921d85365ba7aa78ec2a6091f205ad2\" id:\"8203c8f4ef4ef3992d2ff08c8245827de921d85365ba7aa78ec2a6091f205ad2\" pid:2742 exit_status:1 exited_at:{seconds:1768520989 nanos:132053980}"
Jan 15 23:49:49.152232 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-8203c8f4ef4ef3992d2ff08c8245827de921d85365ba7aa78ec2a6091f205ad2-rootfs.mount: Deactivated successfully.
Jan 15 23:49:49.237864 systemd[1]: cri-containerd-970be65819e8ebfc4459cf48a5283429a0fd2540ce1bee431319cb79977e2200.scope: Deactivated successfully.
Jan 15 23:49:49.238143 systemd[1]: cri-containerd-970be65819e8ebfc4459cf48a5283429a0fd2540ce1bee431319cb79977e2200.scope: Consumed 35.643s CPU time, 124M memory peak.
Jan 15 23:49:49.240462 containerd[1623]: time="2026-01-15T23:49:49.240230983Z" level=info msg="received container exit event container_id:\"970be65819e8ebfc4459cf48a5283429a0fd2540ce1bee431319cb79977e2200\" id:\"970be65819e8ebfc4459cf48a5283429a0fd2540ce1bee431319cb79977e2200\" pid:3215 exit_status:1 exited_at:{seconds:1768520989 nanos:239998142}"
Jan 15 23:49:49.258951 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-970be65819e8ebfc4459cf48a5283429a0fd2540ce1bee431319cb79977e2200-rootfs.mount: Deactivated successfully.
Jan 15 23:49:49.397853 kubelet[2882]: I0115 23:49:49.397428 2882 scope.go:117] "RemoveContainer" containerID="970be65819e8ebfc4459cf48a5283429a0fd2540ce1bee431319cb79977e2200"
Jan 15 23:49:49.400371 kubelet[2882]: I0115 23:49:49.400174 2882 scope.go:117] "RemoveContainer" containerID="8203c8f4ef4ef3992d2ff08c8245827de921d85365ba7aa78ec2a6091f205ad2"
Jan 15 23:49:49.402285 containerd[1623]: time="2026-01-15T23:49:49.402204685Z" level=info msg="CreateContainer within sandbox \"9a6ff961469674d6fd2cafb8d257a5a48d446c81d56d13467f4c89150b9c2559\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Jan 15 23:49:49.402460 kubelet[2882]: I0115 23:49:49.402393 2882 scope.go:117] "RemoveContainer" containerID="0f9d56bac994efa6b74213874400b108aefd8683233dad233862d9b06658faca"
Jan 15 23:49:49.403470 containerd[1623]: time="2026-01-15T23:49:49.402840608Z" level=info msg="CreateContainer within sandbox \"afb65219f234113bd7429606d51b1f08aff828cadbc7363977057e6b6df437b2\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Jan 15 23:49:49.404418 containerd[1623]: time="2026-01-15T23:49:49.404330015Z" level=info msg="CreateContainer within sandbox \"f6bd08ce8c96c11de041647f0e6549be3ffb994cffd89616d40b6975c7904771\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
Jan 15 23:49:49.413208 containerd[1623]: time="2026-01-15T23:49:49.412811296Z" level=info msg="Container ea1294d582371c166ae891bacf24b9be251b5613772c14e90857f36c0396e27e: CDI devices from CRI Config.CDIDevices: []"
Jan 15 23:49:49.421506 containerd[1623]: time="2026-01-15T23:49:49.421467338Z" level=info msg="Container 33f933fce4d57c2a10a822c8a8aebd78e160aee2708d91e1a44ca0df2b5a8326: CDI devices from CRI Config.CDIDevices: []"
Jan 15 23:49:49.425328 containerd[1623]: time="2026-01-15T23:49:49.425199436Z" level=info msg="CreateContainer within sandbox \"afb65219f234113bd7429606d51b1f08aff828cadbc7363977057e6b6df437b2\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"ea1294d582371c166ae891bacf24b9be251b5613772c14e90857f36c0396e27e\""
Jan 15 23:49:49.425755 containerd[1623]: time="2026-01-15T23:49:49.425728239Z" level=info msg="StartContainer for \"ea1294d582371c166ae891bacf24b9be251b5613772c14e90857f36c0396e27e\""
Jan 15 23:49:49.426541 containerd[1623]: time="2026-01-15T23:49:49.426510003Z" level=info msg="Container b0f8aa9b4448ea2d57d8294eaacbcfafc0c8ad48667a24df63f40a5f3a443180: CDI devices from CRI Config.CDIDevices: []"
Jan 15 23:49:49.426710 containerd[1623]: time="2026-01-15T23:49:49.426686243Z" level=info msg="connecting to shim ea1294d582371c166ae891bacf24b9be251b5613772c14e90857f36c0396e27e" address="unix:///run/containerd/s/c7339ee98f994329a9d7949d11eabc95d60e15ee8fd216ab9e0afa6e840de808" protocol=ttrpc version=3
Jan 15 23:49:49.433853 containerd[1623]: time="2026-01-15T23:49:49.433799558Z" level=info msg="CreateContainer within sandbox \"9a6ff961469674d6fd2cafb8d257a5a48d446c81d56d13467f4c89150b9c2559\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"33f933fce4d57c2a10a822c8a8aebd78e160aee2708d91e1a44ca0df2b5a8326\""
Jan 15 23:49:49.434708 containerd[1623]: time="2026-01-15T23:49:49.434677322Z" level=info msg="StartContainer for \"33f933fce4d57c2a10a822c8a8aebd78e160aee2708d91e1a44ca0df2b5a8326\""
Jan 15 23:49:49.436444 containerd[1623]: time="2026-01-15T23:49:49.436407930Z" level=info msg="connecting to shim 33f933fce4d57c2a10a822c8a8aebd78e160aee2708d91e1a44ca0df2b5a8326" address="unix:///run/containerd/s/0daec9c26f6c15595453afc18e5b2e80a23c2a9d76994d50e02e87c3d6d5450c" protocol=ttrpc version=3
Jan 15 23:49:49.438468 containerd[1623]: time="2026-01-15T23:49:49.438350500Z" level=info msg="CreateContainer within sandbox \"f6bd08ce8c96c11de041647f0e6549be3ffb994cffd89616d40b6975c7904771\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"b0f8aa9b4448ea2d57d8294eaacbcfafc0c8ad48667a24df63f40a5f3a443180\""
Jan 15 23:49:49.440326 containerd[1623]: time="2026-01-15T23:49:49.440297229Z" level=info msg="StartContainer for \"b0f8aa9b4448ea2d57d8294eaacbcfafc0c8ad48667a24df63f40a5f3a443180\""
Jan 15 23:49:49.442381 containerd[1623]: time="2026-01-15T23:49:49.442290599Z" level=info msg="connecting to shim b0f8aa9b4448ea2d57d8294eaacbcfafc0c8ad48667a24df63f40a5f3a443180" address="unix:///run/containerd/s/f525541d0296420e40efa1381707f676935a79cf34336b1391fce3123491a4e2" protocol=ttrpc version=3
Jan 15 23:49:49.446650 systemd[1]: Started cri-containerd-ea1294d582371c166ae891bacf24b9be251b5613772c14e90857f36c0396e27e.scope - libcontainer container ea1294d582371c166ae891bacf24b9be251b5613772c14e90857f36c0396e27e.
Jan 15 23:49:49.460621 systemd[1]: Started cri-containerd-33f933fce4d57c2a10a822c8a8aebd78e160aee2708d91e1a44ca0df2b5a8326.scope - libcontainer container 33f933fce4d57c2a10a822c8a8aebd78e160aee2708d91e1a44ca0df2b5a8326.
Jan 15 23:49:49.463550 systemd[1]: Started cri-containerd-b0f8aa9b4448ea2d57d8294eaacbcfafc0c8ad48667a24df63f40a5f3a443180.scope - libcontainer container b0f8aa9b4448ea2d57d8294eaacbcfafc0c8ad48667a24df63f40a5f3a443180.
Jan 15 23:49:49.495072 containerd[1623]: time="2026-01-15T23:49:49.495029894Z" level=info msg="StartContainer for \"ea1294d582371c166ae891bacf24b9be251b5613772c14e90857f36c0396e27e\" returns successfully"
Jan 15 23:49:49.511007 containerd[1623]: time="2026-01-15T23:49:49.510960851Z" level=info msg="StartContainer for \"33f933fce4d57c2a10a822c8a8aebd78e160aee2708d91e1a44ca0df2b5a8326\" returns successfully"
Jan 15 23:49:49.522739 containerd[1623]: time="2026-01-15T23:49:49.522670067Z" level=info msg="StartContainer for \"b0f8aa9b4448ea2d57d8294eaacbcfafc0c8ad48667a24df63f40a5f3a443180\" returns successfully"
Jan 15 23:49:51.337843 kubelet[2882]: E0115 23:49:51.337716 2882 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.10.219:55684->10.0.10.151:2379: read: connection timed out" event="&Event{ObjectMeta:{calico-apiserver-76998f65d4-hpm9k.188b0c514e7131ab calico-apiserver 1727 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:calico-apiserver,Name:calico-apiserver-76998f65d4-hpm9k,UID:ff12f75f-0f47-4e03-a069-ef3f612b51b0,APIVersion:v1,ResourceVersion:805,FieldPath:spec.containers{calico-apiserver},},Reason:BackOff,Message:Back-off pulling image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\",Source:EventSource{Component:kubelet,Host:ci-4459-2-2-n-b7ec270451,},FirstTimestamp:2026-01-15 23:47:02 +0000 UTC,LastTimestamp:2026-01-15 23:49:40.881243641 +0000 UTC m=+205.081238986,Count:11,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-2-n-b7ec270451,}"
Jan 15 23:49:51.881624 kubelet[2882]: E0115 23:49:51.881517 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76998f65d4-hpm9k" podUID="ff12f75f-0f47-4e03-a069-ef3f612b51b0"
Jan 15 23:49:51.882383 kubelet[2882]: E0115 23:49:51.882087 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-zqjh8" podUID="fee8d1af-3972-419e-8500-84b3b6b46b71"
Jan 15 23:49:57.673463 kubelet[2882]: I0115 23:49:57.673402 2882 status_manager.go:895] "Failed to get status for pod" podUID="6485e0fd-2146-4cba-a391-bc44d6a007bf" pod="tigera-operator/tigera-operator-7dcd859c48-mvm85" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.10.219:55770->10.0.10.151:2379: read: connection timed out"
Jan 15 23:49:57.882183 containerd[1623]: time="2026-01-15T23:49:57.882130972Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\""
Jan 15 23:49:58.217714 containerd[1623]: time="2026-01-15T23:49:58.217661873Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Jan 15 23:49:58.219100 containerd[1623]: time="2026-01-15T23:49:58.219016879Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found"
Jan 15 23:49:58.219100 containerd[1623]: time="2026-01-15T23:49:58.219071919Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77"
Jan 15 23:49:58.219253 kubelet[2882]: E0115 23:49:58.219211 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4"
Jan 15 23:49:58.219302 kubelet[2882]: E0115 23:49:58.219259 2882 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4"
Jan 15 23:49:58.219560 kubelet[2882]: E0115 23:49:58.219501 2882 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k9n5x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-t2blj_calico-system(c476e4d7-47c4-4b45-afc2-049d681292b9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError"
Jan 15 23:49:58.219832 containerd[1623]: time="2026-01-15T23:49:58.219541722Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\""
Jan 15 23:49:58.220851 kubelet[2882]: E0115 23:49:58.220820 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-t2blj" podUID="c476e4d7-47c4-4b45-afc2-049d681292b9"
Jan 15 23:49:58.556168 containerd[1623]: time="2026-01-15T23:49:58.555983067Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Jan 15 23:49:58.557692 containerd[1623]: time="2026-01-15T23:49:58.557632075Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found"
Jan 15 23:49:58.557749 containerd[1623]: time="2026-01-15T23:49:58.557641075Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73"
Jan 15 23:49:58.557908 kubelet[2882]: E0115 23:49:58.557857 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4"
Jan 15 23:49:58.557908 kubelet[2882]: E0115 23:49:58.557904 2882 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4"
Jan 15 23:49:58.558046 kubelet[2882]: E0115 23:49:58.558013 2882 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:3d31bd98c0c242218cb5129b8a984197,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4dslx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6d7d4c4d95-4t7sh_calico-system(7d889576-5f17-49b8-be14-3f0e7bea06cf): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError"
Jan 15 23:49:58.559853 containerd[1623]: time="2026-01-15T23:49:58.559826006Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\""
Jan 15 23:49:58.843920 kubelet[2882]: E0115 23:49:58.843760 2882 controller.go:195] "Failed to update lease" err="Put \"https://10.0.10.219:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-n-b7ec270451?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 15 23:49:58.883391 containerd[1623]: time="2026-01-15T23:49:58.883356248Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Jan 15 23:49:58.885044 containerd[1623]: time="2026-01-15T23:49:58.885007016Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found"
Jan 15 23:49:58.885118 containerd[1623]: time="2026-01-15T23:49:58.885076897Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85"
Jan 15 23:49:58.885385 kubelet[2882]: E0115 23:49:58.885237 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4"
Jan 15 23:49:58.885385 kubelet[2882]: E0115 23:49:58.885283 2882 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4"
Jan 15 23:49:58.885547 kubelet[2882]: E0115 23:49:58.885493 2882 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4dslx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6d7d4c4d95-4t7sh_calico-system(7d889576-5f17-49b8-be14-3f0e7bea06cf): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError"
Jan 15 23:49:58.885700 containerd[1623]: time="2026-01-15T23:49:58.885676620Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\""
Jan 15 23:49:58.887040 kubelet[2882]: E0115 23:49:58.887003 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6d7d4c4d95-4t7sh" podUID="7d889576-5f17-49b8-be14-3f0e7bea06cf"
Jan 15 23:49:59.222618 containerd[1623]: time="2026-01-15T23:49:59.222527087Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Jan 15 23:49:59.224296 containerd[1623]: time="2026-01-15T23:49:59.224249695Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found"
Jan 15 23:49:59.224391 containerd[1623]: time="2026-01-15T23:49:59.224336896Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77"
Jan 15 23:49:59.224549 kubelet[2882]: E0115 23:49:59.224492 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Jan 15 23:49:59.224609 kubelet[2882]: E0115 23:49:59.224553 2882 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Jan 15 23:49:59.224775 kubelet[2882]: E0115 23:49:59.224704 2882 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8l7vh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-76998f65d4-lbk6r_calico-apiserver(717a4ddd-81e7-49b2-b875-208b71de27d1): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError"
Jan 15 23:49:59.225928 kubelet[2882]: E0115 23:49:59.225880 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76998f65d4-lbk6r" podUID="717a4ddd-81e7-49b2-b875-208b71de27d1"
Jan 15 23:50:00.698537 kernel: pcieport 0000:00:01.0: pciehp: Slot(0): Button press: will power off in 5 sec
Jan 15 23:50:00.728110 systemd[1]: cri-containerd-ea1294d582371c166ae891bacf24b9be251b5613772c14e90857f36c0396e27e.scope: Deactivated successfully.
Jan 15 23:50:00.728862 containerd[1623]: time="2026-01-15T23:50:00.728695803Z" level=info msg="received container exit event container_id:\"ea1294d582371c166ae891bacf24b9be251b5613772c14e90857f36c0396e27e\" id:\"ea1294d582371c166ae891bacf24b9be251b5613772c14e90857f36c0396e27e\" pid:5708 exit_status:1 exited_at:{seconds:1768521000 nanos:728516682}"
Jan 15 23:50:00.746529 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ea1294d582371c166ae891bacf24b9be251b5613772c14e90857f36c0396e27e-rootfs.mount: Deactivated successfully.
Jan 15 23:50:01.435572 kubelet[2882]: I0115 23:50:01.435495 2882 scope.go:117] "RemoveContainer" containerID="970be65819e8ebfc4459cf48a5283429a0fd2540ce1bee431319cb79977e2200"
Jan 15 23:50:01.436117 kubelet[2882]: I0115 23:50:01.435801 2882 scope.go:117] "RemoveContainer" containerID="ea1294d582371c166ae891bacf24b9be251b5613772c14e90857f36c0396e27e"
Jan 15 23:50:01.436117 kubelet[2882]: E0115 23:50:01.435972 2882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-7dcd859c48-mvm85_tigera-operator(6485e0fd-2146-4cba-a391-bc44d6a007bf)\"" pod="tigera-operator/tigera-operator-7dcd859c48-mvm85" podUID="6485e0fd-2146-4cba-a391-bc44d6a007bf"
Jan 15 23:50:01.437735 containerd[1623]: time="2026-01-15T23:50:01.437699948Z" level=info msg="RemoveContainer for \"970be65819e8ebfc4459cf48a5283429a0fd2540ce1bee431319cb79977e2200\""
Jan 15 23:50:01.443935 containerd[1623]: time="2026-01-15T23:50:01.443599377Z" level=info msg="RemoveContainer for \"970be65819e8ebfc4459cf48a5283429a0fd2540ce1bee431319cb79977e2200\" returns successfully"
Jan 15 23:50:03.884317 containerd[1623]: time="2026-01-15T23:50:03.884281488Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\""