Dec 12 17:24:28.789011 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Dec 12 17:24:28.789031 kernel: Linux version 6.12.61-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Fri Dec 12 15:20:48 -00 2025
Dec 12 17:24:28.789041 kernel: KASLR enabled
Dec 12 17:24:28.789047 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II
Dec 12 17:24:28.789052 kernel: efi: SMBIOS 3.0=0x139ed0000 MEMATTR=0x1390b8118 ACPI 2.0=0x136760018 RNG=0x13676e918 MEMRESERVE=0x136b41218
Dec 12 17:24:28.789058 kernel: random: crng init done
Dec 12 17:24:28.789064 kernel: secureboot: Secure boot disabled
Dec 12 17:24:28.789070 kernel: ACPI: Early table checksum verification disabled
Dec 12 17:24:28.789076 kernel: ACPI: RSDP 0x0000000136760018 000024 (v02 BOCHS )
Dec 12 17:24:28.789081 kernel: ACPI: XSDT 0x000000013676FE98 00006C (v01 BOCHS BXPC 00000001 01000013)
Dec 12 17:24:28.789088 kernel: ACPI: FACP 0x000000013676FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 17:24:28.789094 kernel: ACPI: DSDT 0x0000000136767518 001468 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 17:24:28.789100 kernel: ACPI: APIC 0x000000013676FC18 000108 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 17:24:28.789105 kernel: ACPI: PPTT 0x000000013676FD98 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 17:24:28.789112 kernel: ACPI: GTDT 0x000000013676D898 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 17:24:28.789119 kernel: ACPI: MCFG 0x000000013676FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 17:24:28.789126 kernel: ACPI: SPCR 0x000000013676E818 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 17:24:28.789132 kernel: ACPI: DBG2 0x000000013676E898 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 17:24:28.789138 kernel: ACPI: IORT 0x000000013676E418 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 17:24:28.789144 kernel: ACPI: BGRT 0x000000013676E798 000038 (v01 INTEL EDK2 00000002 01000013)
Dec 12 17:24:28.789150 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
Dec 12 17:24:28.789156 kernel: ACPI: Use ACPI SPCR as default console: Yes
Dec 12 17:24:28.789162 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x0000000139ffffff]
Dec 12 17:24:28.789168 kernel: NODE_DATA(0) allocated [mem 0x13967da00-0x139684fff]
Dec 12 17:24:28.789174 kernel: Zone ranges:
Dec 12 17:24:28.789179 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff]
Dec 12 17:24:28.789193 kernel: DMA32 empty
Dec 12 17:24:28.789200 kernel: Normal [mem 0x0000000100000000-0x0000000139ffffff]
Dec 12 17:24:28.789207 kernel: Device empty
Dec 12 17:24:28.789212 kernel: Movable zone start for each node
Dec 12 17:24:28.789218 kernel: Early memory node ranges
Dec 12 17:24:28.789224 kernel: node 0: [mem 0x0000000040000000-0x000000013666ffff]
Dec 12 17:24:28.789230 kernel: node 0: [mem 0x0000000136670000-0x000000013667ffff]
Dec 12 17:24:28.789236 kernel: node 0: [mem 0x0000000136680000-0x000000013676ffff]
Dec 12 17:24:28.789242 kernel: node 0: [mem 0x0000000136770000-0x0000000136b3ffff]
Dec 12 17:24:28.789248 kernel: node 0: [mem 0x0000000136b40000-0x0000000139e1ffff]
Dec 12 17:24:28.789254 kernel: node 0: [mem 0x0000000139e20000-0x0000000139eaffff]
Dec 12 17:24:28.789260 kernel: node 0: [mem 0x0000000139eb0000-0x0000000139ebffff]
Dec 12 17:24:28.789268 kernel: node 0: [mem 0x0000000139ec0000-0x0000000139fdffff]
Dec 12 17:24:28.789274 kernel: node 0: [mem 0x0000000139fe0000-0x0000000139ffffff]
Dec 12 17:24:28.789284 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x0000000139ffffff]
Dec 12 17:24:28.789293 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges
Dec 12 17:24:28.789299 kernel: cma: Reserved 16 MiB at 0x00000000ff000000 on node -1
Dec 12 17:24:28.789307 kernel: psci: probing for conduit method from ACPI.
Dec 12 17:24:28.789314 kernel: psci: PSCIv1.1 detected in firmware.
Dec 12 17:24:28.789320 kernel: psci: Using standard PSCI v0.2 function IDs
Dec 12 17:24:28.789326 kernel: psci: Trusted OS migration not required
Dec 12 17:24:28.789333 kernel: psci: SMC Calling Convention v1.1
Dec 12 17:24:28.789339 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Dec 12 17:24:28.789346 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Dec 12 17:24:28.789352 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Dec 12 17:24:28.789359 kernel: pcpu-alloc: [0] 0 [0] 1
Dec 12 17:24:28.789368 kernel: Detected PIPT I-cache on CPU0
Dec 12 17:24:28.789375 kernel: CPU features: detected: GIC system register CPU interface
Dec 12 17:24:28.789382 kernel: CPU features: detected: Spectre-v4
Dec 12 17:24:28.789388 kernel: CPU features: detected: Spectre-BHB
Dec 12 17:24:28.789395 kernel: CPU features: kernel page table isolation forced ON by KASLR
Dec 12 17:24:28.789401 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Dec 12 17:24:28.789407 kernel: CPU features: detected: ARM erratum 1418040
Dec 12 17:24:28.789414 kernel: CPU features: detected: SSBS not fully self-synchronizing
Dec 12 17:24:28.789420 kernel: alternatives: applying boot alternatives
Dec 12 17:24:28.789427 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=361f5baddf90aee3bc7ee7e9be879bc0cc94314f224faa1e2791d9b44cd3ec52
Dec 12 17:24:28.789434 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Dec 12 17:24:28.789440 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Dec 12 17:24:28.789447 kernel: Fallback order for Node 0: 0
Dec 12 17:24:28.789455 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1024000
Dec 12 17:24:28.789461 kernel: Policy zone: Normal
Dec 12 17:24:28.789468 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec 12 17:24:28.789474 kernel: software IO TLB: area num 2.
Dec 12 17:24:28.789480 kernel: software IO TLB: mapped [mem 0x00000000fb000000-0x00000000ff000000] (64MB)
Dec 12 17:24:28.789487 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Dec 12 17:24:28.789493 kernel: rcu: Preemptible hierarchical RCU implementation.
Dec 12 17:24:28.789500 kernel: rcu: RCU event tracing is enabled.
Dec 12 17:24:28.789506 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Dec 12 17:24:28.789513 kernel: Trampoline variant of Tasks RCU enabled.
Dec 12 17:24:28.789519 kernel: Tracing variant of Tasks RCU enabled.
Dec 12 17:24:28.789526 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec 12 17:24:28.789534 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Dec 12 17:24:28.789540 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Dec 12 17:24:28.789547 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Dec 12 17:24:28.789553 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Dec 12 17:24:28.789560 kernel: GICv3: 256 SPIs implemented
Dec 12 17:24:28.789569 kernel: GICv3: 0 Extended SPIs implemented
Dec 12 17:24:28.789575 kernel: Root IRQ handler: gic_handle_irq
Dec 12 17:24:28.789611 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Dec 12 17:24:28.789618 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0
Dec 12 17:24:28.789625 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Dec 12 17:24:28.789631 kernel: ITS [mem 0x08080000-0x0809ffff]
Dec 12 17:24:28.789640 kernel: ITS@0x0000000008080000: allocated 8192 Devices @100100000 (indirect, esz 8, psz 64K, shr 1)
Dec 12 17:24:28.789647 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @100110000 (flat, esz 8, psz 64K, shr 1)
Dec 12 17:24:28.789653 kernel: GICv3: using LPI property table @0x0000000100120000
Dec 12 17:24:28.789660 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000100130000
Dec 12 17:24:28.789666 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec 12 17:24:28.789673 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Dec 12 17:24:28.789680 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Dec 12 17:24:28.789686 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Dec 12 17:24:28.789693 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Dec 12 17:24:28.789699 kernel: Console: colour dummy device 80x25
Dec 12 17:24:28.789706 kernel: ACPI: Core revision 20240827
Dec 12 17:24:28.789714 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Dec 12 17:24:28.789721 kernel: pid_max: default: 32768 minimum: 301
Dec 12 17:24:28.789728 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Dec 12 17:24:28.789734 kernel: landlock: Up and running.
Dec 12 17:24:28.789741 kernel: SELinux: Initializing.
Dec 12 17:24:28.789747 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Dec 12 17:24:28.789754 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Dec 12 17:24:28.789760 kernel: rcu: Hierarchical SRCU implementation.
Dec 12 17:24:28.789767 kernel: rcu: Max phase no-delay instances is 400.
Dec 12 17:24:28.789775 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Dec 12 17:24:28.789782 kernel: Remapping and enabling EFI services.
Dec 12 17:24:28.789788 kernel: smp: Bringing up secondary CPUs ...
Dec 12 17:24:28.789795 kernel: Detected PIPT I-cache on CPU1
Dec 12 17:24:28.789804 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Dec 12 17:24:28.789811 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100140000
Dec 12 17:24:28.789818 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Dec 12 17:24:28.789825 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Dec 12 17:24:28.789831 kernel: smp: Brought up 1 node, 2 CPUs
Dec 12 17:24:28.789838 kernel: SMP: Total of 2 processors activated.
Dec 12 17:24:28.789851 kernel: CPU: All CPU(s) started at EL1
Dec 12 17:24:28.789858 kernel: CPU features: detected: 32-bit EL0 Support
Dec 12 17:24:28.792505 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Dec 12 17:24:28.792514 kernel: CPU features: detected: Common not Private translations
Dec 12 17:24:28.792521 kernel: CPU features: detected: CRC32 instructions
Dec 12 17:24:28.792529 kernel: CPU features: detected: Enhanced Virtualization Traps
Dec 12 17:24:28.792536 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Dec 12 17:24:28.792546 kernel: CPU features: detected: LSE atomic instructions
Dec 12 17:24:28.792553 kernel: CPU features: detected: Privileged Access Never
Dec 12 17:24:28.792559 kernel: CPU features: detected: RAS Extension Support
Dec 12 17:24:28.792567 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Dec 12 17:24:28.792574 kernel: alternatives: applying system-wide alternatives
Dec 12 17:24:28.792596 kernel: CPU features: detected: Hardware dirty bit management on CPU0-1
Dec 12 17:24:28.792605 kernel: Memory: 3858852K/4096000K available (11200K kernel code, 2456K rwdata, 9084K rodata, 39552K init, 1038K bss, 215668K reserved, 16384K cma-reserved)
Dec 12 17:24:28.792612 kernel: devtmpfs: initialized
Dec 12 17:24:28.792624 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec 12 17:24:28.792634 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Dec 12 17:24:28.792641 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Dec 12 17:24:28.792651 kernel: 0 pages in range for non-PLT usage
Dec 12 17:24:28.792658 kernel: 508400 pages in range for PLT usage
Dec 12 17:24:28.792665 kernel: pinctrl core: initialized pinctrl subsystem
Dec 12 17:24:28.792672 kernel: SMBIOS 3.0.0 present.
Dec 12 17:24:28.792679 kernel: DMI: Hetzner vServer/KVM Virtual Machine, BIOS 20171111 11/11/2017
Dec 12 17:24:28.792686 kernel: DMI: Memory slots populated: 1/1
Dec 12 17:24:28.792693 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec 12 17:24:28.792702 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Dec 12 17:24:28.792709 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Dec 12 17:24:28.792716 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Dec 12 17:24:28.792723 kernel: audit: initializing netlink subsys (disabled)
Dec 12 17:24:28.792730 kernel: audit: type=2000 audit(0.019:1): state=initialized audit_enabled=0 res=1
Dec 12 17:24:28.792737 kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec 12 17:24:28.792744 kernel: cpuidle: using governor menu
Dec 12 17:24:28.792754 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Dec 12 17:24:28.792761 kernel: ASID allocator initialised with 32768 entries
Dec 12 17:24:28.792770 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec 12 17:24:28.792777 kernel: Serial: AMBA PL011 UART driver
Dec 12 17:24:28.792784 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Dec 12 17:24:28.792790 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Dec 12 17:24:28.792797 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Dec 12 17:24:28.792804 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Dec 12 17:24:28.792811 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Dec 12 17:24:28.792818 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Dec 12 17:24:28.792825 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Dec 12 17:24:28.792834 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Dec 12 17:24:28.792841 kernel: ACPI: Added _OSI(Module Device)
Dec 12 17:24:28.792848 kernel: ACPI: Added _OSI(Processor Device)
Dec 12 17:24:28.792878 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec 12 17:24:28.792887 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec 12 17:24:28.792894 kernel: ACPI: Interpreter enabled
Dec 12 17:24:28.792901 kernel: ACPI: Using GIC for interrupt routing
Dec 12 17:24:28.792908 kernel: ACPI: MCFG table detected, 1 entries
Dec 12 17:24:28.792915 kernel: ACPI: CPU0 has been hot-added
Dec 12 17:24:28.792924 kernel: ACPI: CPU1 has been hot-added
Dec 12 17:24:28.792935 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Dec 12 17:24:28.792942 kernel: printk: legacy console [ttyAMA0] enabled
Dec 12 17:24:28.792949 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Dec 12 17:24:28.793092 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Dec 12 17:24:28.793156 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Dec 12 17:24:28.793215 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Dec 12 17:24:28.793272 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Dec 12 17:24:28.793332 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Dec 12 17:24:28.793341 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Dec 12 17:24:28.793348 kernel: PCI host bridge to bus 0000:00
Dec 12 17:24:28.793419 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Dec 12 17:24:28.793473 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Dec 12 17:24:28.793526 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Dec 12 17:24:28.793596 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Dec 12 17:24:28.793679 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint
Dec 12 17:24:28.793754 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x038000 conventional PCI endpoint
Dec 12 17:24:28.793821 kernel: pci 0000:00:01.0: BAR 1 [mem 0x11289000-0x11289fff]
Dec 12 17:24:28.793979 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]
Dec 12 17:24:28.794079 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:28.794149 kernel: pci 0000:00:02.0: BAR 0 [mem 0x11288000-0x11288fff]
Dec 12 17:24:28.794218 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Dec 12 17:24:28.794278 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff]
Dec 12 17:24:28.794343 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80000fffff 64bit pref]
Dec 12 17:24:28.794410 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:28.794482 kernel: pci 0000:00:02.1: BAR 0 [mem 0x11287000-0x11287fff]
Dec 12 17:24:28.796186 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Dec 12 17:24:28.796261 kernel: pci 0000:00:02.1: bridge window [mem 0x10e00000-0x10ffffff]
Dec 12 17:24:28.796351 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:28.796420 kernel: pci 0000:00:02.2: BAR 0 [mem 0x11286000-0x11286fff]
Dec 12 17:24:28.796490 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Dec 12 17:24:28.796550 kernel: pci 0000:00:02.2: bridge window [mem 0x10c00000-0x10dfffff]
Dec 12 17:24:28.796625 kernel: pci 0000:00:02.2: bridge window [mem 0x8000100000-0x80001fffff 64bit pref]
Dec 12 17:24:28.796707 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:28.796768 kernel: pci 0000:00:02.3: BAR 0 [mem 0x11285000-0x11285fff]
Dec 12 17:24:28.796831 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Dec 12 17:24:28.796901 kernel: pci 0000:00:02.3: bridge window [mem 0x10a00000-0x10bfffff]
Dec 12 17:24:28.796962 kernel: pci 0000:00:02.3: bridge window [mem 0x8000200000-0x80002fffff 64bit pref]
Dec 12 17:24:28.797043 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:28.797105 kernel: pci 0000:00:02.4: BAR 0 [mem 0x11284000-0x11284fff]
Dec 12 17:24:28.797165 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Dec 12 17:24:28.797224 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]
Dec 12 17:24:28.797286 kernel: pci 0000:00:02.4: bridge window [mem 0x8000300000-0x80003fffff 64bit pref]
Dec 12 17:24:28.797358 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:28.797425 kernel: pci 0000:00:02.5: BAR 0 [mem 0x11283000-0x11283fff]
Dec 12 17:24:28.797490 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Dec 12 17:24:28.797549 kernel: pci 0000:00:02.5: bridge window [mem 0x10600000-0x107fffff]
Dec 12 17:24:28.797649 kernel: pci 0000:00:02.5: bridge window [mem 0x8000400000-0x80004fffff 64bit pref]
Dec 12 17:24:28.797722 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:28.797787 kernel: pci 0000:00:02.6: BAR 0 [mem 0x11282000-0x11282fff]
Dec 12 17:24:28.797846 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Dec 12 17:24:28.799073 kernel: pci 0000:00:02.6: bridge window [mem 0x10400000-0x105fffff]
Dec 12 17:24:28.799161 kernel: pci 0000:00:02.6: bridge window [mem 0x8000500000-0x80005fffff 64bit pref]
Dec 12 17:24:28.799246 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:28.799308 kernel: pci 0000:00:02.7: BAR 0 [mem 0x11281000-0x11281fff]
Dec 12 17:24:28.799373 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Dec 12 17:24:28.799433 kernel: pci 0000:00:02.7: bridge window [mem 0x10200000-0x103fffff]
Dec 12 17:24:28.799517 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:28.799620 kernel: pci 0000:00:03.0: BAR 0 [mem 0x11280000-0x11280fff]
Dec 12 17:24:28.799689 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Dec 12 17:24:28.799748 kernel: pci 0000:00:03.0: bridge window [mem 0x10000000-0x101fffff]
Dec 12 17:24:28.799824 kernel: pci 0000:00:04.0: [1b36:0002] type 00 class 0x070002 conventional PCI endpoint
Dec 12 17:24:28.799967 kernel: pci 0000:00:04.0: BAR 0 [io 0x0000-0x0007]
Dec 12 17:24:28.800067 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Dec 12 17:24:28.800180 kernel: pci 0000:01:00.0: BAR 1 [mem 0x11000000-0x11000fff]
Dec 12 17:24:28.800254 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]
Dec 12 17:24:28.800343 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Dec 12 17:24:28.800446 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Dec 12 17:24:28.800516 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10e00000-0x10e03fff 64bit]
Dec 12 17:24:28.800605 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint
Dec 12 17:24:28.800680 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10c00000-0x10c00fff]
Dec 12 17:24:28.800753 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000100000-0x8000103fff 64bit pref]
Dec 12 17:24:28.800827 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Dec 12 17:24:28.801355 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000200000-0x8000203fff 64bit pref]
Dec 12 17:24:28.801446 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Dec 12 17:24:28.801517 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff]
Dec 12 17:24:28.801625 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000300000-0x8000303fff 64bit pref]
Dec 12 17:24:28.801720 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint
Dec 12 17:24:28.801787 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10600000-0x10600fff]
Dec 12 17:24:28.801856 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]
Dec 12 17:24:28.802016 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Dec 12 17:24:28.802118 kernel: pci 0000:07:00.0: BAR 1 [mem 0x10400000-0x10400fff]
Dec 12 17:24:28.802201 kernel: pci 0000:07:00.0: BAR 4 [mem 0x8000500000-0x8000503fff 64bit pref]
Dec 12 17:24:28.802270 kernel: pci 0000:07:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Dec 12 17:24:28.802359 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
Dec 12 17:24:28.802434 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000
Dec 12 17:24:28.802496 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000
Dec 12 17:24:28.802568 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
Dec 12 17:24:28.802656 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
Dec 12 17:24:28.802767 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000
Dec 12 17:24:28.802832 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Dec 12 17:24:28.802904 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000
Dec 12 17:24:28.804044 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000
Dec 12 17:24:28.804134 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Dec 12 17:24:28.804196 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000
Dec 12 17:24:28.804291 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
Dec 12 17:24:28.804367 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Dec 12 17:24:28.804442 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000
Dec 12 17:24:28.804508 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000
Dec 12 17:24:28.804576 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Dec 12 17:24:28.804689 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000
Dec 12 17:24:28.804758 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000
Dec 12 17:24:28.804825 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Dec 12 17:24:28.804902 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 07] add_size 100000 add_align 100000
Dec 12 17:24:28.804962 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff] to [bus 07] add_size 100000 add_align 100000
Dec 12 17:24:28.805026 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Dec 12 17:24:28.805084 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000
Dec 12 17:24:28.805143 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000
Dec 12 17:24:28.805213 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Dec 12 17:24:28.805282 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000
Dec 12 17:24:28.805347 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000
Dec 12 17:24:28.805413 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff]: assigned
Dec 12 17:24:28.805479 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]: assigned
Dec 12 17:24:28.805540 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff]: assigned
Dec 12 17:24:28.805611 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]: assigned
Dec 12 17:24:28.805672 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff]: assigned
Dec 12 17:24:28.805734 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]: assigned
Dec 12 17:24:28.805794 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff]: assigned
Dec 12 17:24:28.805852 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]: assigned
Dec 12 17:24:28.806817 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]: assigned
Dec 12 17:24:28.807956 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]: assigned
Dec 12 17:24:28.808032 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff]: assigned
Dec 12 17:24:28.808094 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]: assigned
Dec 12 17:24:28.808171 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff]: assigned
Dec 12 17:24:28.808245 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]: assigned
Dec 12 17:24:28.808307 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff]: assigned
Dec 12 17:24:28.808374 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]: assigned
Dec 12 17:24:28.808443 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff]: assigned
Dec 12 17:24:28.808507 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]: assigned
Dec 12 17:24:28.808577 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8001200000-0x8001203fff 64bit pref]: assigned
Dec 12 17:24:28.808652 kernel: pci 0000:00:01.0: BAR 1 [mem 0x11200000-0x11200fff]: assigned
Dec 12 17:24:28.808722 kernel: pci 0000:00:02.0: BAR 0 [mem 0x11201000-0x11201fff]: assigned
Dec 12 17:24:28.808791 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]: assigned
Dec 12 17:24:28.808858 kernel: pci 0000:00:02.1: BAR 0 [mem 0x11202000-0x11202fff]: assigned
Dec 12 17:24:28.808936 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]: assigned
Dec 12 17:24:28.808999 kernel: pci 0000:00:02.2: BAR 0 [mem 0x11203000-0x11203fff]: assigned
Dec 12 17:24:28.809070 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]: assigned
Dec 12 17:24:28.809132 kernel: pci 0000:00:02.3: BAR 0 [mem 0x11204000-0x11204fff]: assigned
Dec 12 17:24:28.809192 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]: assigned
Dec 12 17:24:28.809256 kernel: pci 0000:00:02.4: BAR 0 [mem 0x11205000-0x11205fff]: assigned
Dec 12 17:24:28.809324 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]: assigned
Dec 12 17:24:28.809388 kernel: pci 0000:00:02.5: BAR 0 [mem 0x11206000-0x11206fff]: assigned
Dec 12 17:24:28.809447 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]: assigned
Dec 12 17:24:28.809505 kernel: pci 0000:00:02.6: BAR 0 [mem 0x11207000-0x11207fff]: assigned
Dec 12 17:24:28.809574 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]: assigned
Dec 12 17:24:28.809680 kernel: pci 0000:00:02.7: BAR 0 [mem 0x11208000-0x11208fff]: assigned
Dec 12 17:24:28.809741 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]: assigned
Dec 12 17:24:28.809801 kernel: pci 0000:00:03.0: BAR 0 [mem 0x11209000-0x11209fff]: assigned
Dec 12 17:24:28.810888 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff]: assigned
Dec 12 17:24:28.811008 kernel: pci 0000:00:04.0: BAR 0 [io 0xa000-0xa007]: assigned
Dec 12 17:24:28.811105 kernel: pci 0000:01:00.0: ROM [mem 0x10000000-0x1007ffff pref]: assigned
Dec 12 17:24:28.811193 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned
Dec 12 17:24:28.811267 kernel: pci 0000:01:00.0: BAR 1 [mem 0x10080000-0x10080fff]: assigned
Dec 12 17:24:28.811329 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Dec 12 17:24:28.811403 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]
Dec 12 17:24:28.811479 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff]
Dec 12 17:24:28.811541 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]
Dec 12 17:24:28.811652 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10200000-0x10203fff 64bit]: assigned
Dec 12 17:24:28.812983 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Dec 12 17:24:28.813094 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]
Dec 12 17:24:28.813166 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff]
Dec 12 17:24:28.813240 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]
Dec 12 17:24:28.813321 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]: assigned
Dec 12 17:24:28.813413 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10400000-0x10400fff]: assigned
Dec 12 17:24:28.813486 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Dec 12 17:24:28.813569 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]
Dec 12 17:24:28.813685 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff]
Dec 12 17:24:28.813761 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]
Dec 12 17:24:28.813845 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]: assigned
Dec 12 17:24:28.813936 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Dec 12 17:24:28.814047 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]
Dec 12 17:24:28.814132 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff]
Dec 12 17:24:28.814200 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]
Dec 12 17:24:28.814291 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000800000-0x8000803fff 64bit pref]: assigned
Dec 12 17:24:28.814368 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff]: assigned
Dec 12 17:24:28.814435 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Dec 12 17:24:28.814501 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]
Dec 12 17:24:28.814972 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]
Dec 12 17:24:28.815056 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]
Dec 12 17:24:28.815135 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000a00000-0x8000a03fff 64bit pref]: assigned
Dec 12 17:24:28.815210 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10a00000-0x10a00fff]: assigned
Dec 12 17:24:28.815284 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Dec 12 17:24:28.815407 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]
Dec 12 17:24:28.815490 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff]
Dec 12 17:24:28.815558 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]
Dec 12 17:24:28.815715 kernel: pci 0000:07:00.0: ROM [mem 0x10c00000-0x10c7ffff pref]: assigned
Dec 12 17:24:28.815798 kernel: pci 0000:07:00.0: BAR 4 [mem 0x8000c00000-0x8000c03fff 64bit pref]: assigned
Dec 12 17:24:28.817545 kernel: pci 0000:07:00.0: BAR 1 [mem 0x10c80000-0x10c80fff]: assigned
Dec 12 17:24:28.817734 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Dec 12 17:24:28.817807 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]
Dec 12 17:24:28.817891 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff]
Dec 12 17:24:28.817959 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]
Dec 12 17:24:28.818024 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Dec 12 17:24:28.818091 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]
Dec 12 17:24:28.818174 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff]
Dec 12 17:24:28.818251 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]
Dec 12 17:24:28.818344 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Dec 12 17:24:28.818425 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff]
Dec 12
17:24:28.818501 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff] Dec 12 17:24:28.818597 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref] Dec 12 17:24:28.818673 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] Dec 12 17:24:28.818741 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] Dec 12 17:24:28.818832 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window] Dec 12 17:24:28.819801 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff] Dec 12 17:24:28.819949 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff] Dec 12 17:24:28.820032 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref] Dec 12 17:24:28.820284 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x2fff] Dec 12 17:24:28.820363 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff] Dec 12 17:24:28.820427 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref] Dec 12 17:24:28.820504 kernel: pci_bus 0000:03: resource 0 [io 0x3000-0x3fff] Dec 12 17:24:28.820572 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff] Dec 12 17:24:28.820697 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref] Dec 12 17:24:28.820767 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff] Dec 12 17:24:28.820830 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff] Dec 12 17:24:28.821461 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref] Dec 12 17:24:28.821550 kernel: pci_bus 0000:05: resource 0 [io 0x5000-0x5fff] Dec 12 17:24:28.821639 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff] Dec 12 17:24:28.821713 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref] Dec 12 17:24:28.821786 kernel: pci_bus 0000:06: resource 0 [io 0x6000-0x6fff] Dec 12 17:24:28.821858 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff] Dec 12 17:24:28.821994 kernel: 
pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref] Dec 12 17:24:28.822069 kernel: pci_bus 0000:07: resource 0 [io 0x7000-0x7fff] Dec 12 17:24:28.822132 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff] Dec 12 17:24:28.822201 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref] Dec 12 17:24:28.822276 kernel: pci_bus 0000:08: resource 0 [io 0x8000-0x8fff] Dec 12 17:24:28.822351 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] Dec 12 17:24:28.822415 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] Dec 12 17:24:28.822482 kernel: pci_bus 0000:09: resource 0 [io 0x9000-0x9fff] Dec 12 17:24:28.822554 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] Dec 12 17:24:28.822685 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] Dec 12 17:24:28.822698 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Dec 12 17:24:28.822707 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Dec 12 17:24:28.822718 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Dec 12 17:24:28.822725 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Dec 12 17:24:28.822733 kernel: iommu: Default domain type: Translated Dec 12 17:24:28.822740 kernel: iommu: DMA domain TLB invalidation policy: strict mode Dec 12 17:24:28.822750 kernel: efivars: Registered efivars operations Dec 12 17:24:28.822759 kernel: vgaarb: loaded Dec 12 17:24:28.822766 kernel: clocksource: Switched to clocksource arch_sys_counter Dec 12 17:24:28.822775 kernel: VFS: Disk quotas dquot_6.6.0 Dec 12 17:24:28.822784 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Dec 12 17:24:28.822795 kernel: pnp: PnP ACPI init Dec 12 17:24:28.822902 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Dec 12 17:24:28.822915 kernel: pnp: PnP ACPI: found 1 devices Dec 12 17:24:28.822923 kernel: NET: Registered PF_INET 
protocol family Dec 12 17:24:28.822930 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Dec 12 17:24:28.822938 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Dec 12 17:24:28.822945 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Dec 12 17:24:28.822953 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Dec 12 17:24:28.822963 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Dec 12 17:24:28.822971 kernel: TCP: Hash tables configured (established 32768 bind 32768) Dec 12 17:24:28.822981 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Dec 12 17:24:28.822989 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Dec 12 17:24:28.822998 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Dec 12 17:24:28.823090 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Dec 12 17:24:28.823103 kernel: PCI: CLS 0 bytes, default 64 Dec 12 17:24:28.823112 kernel: kvm [1]: HYP mode not available Dec 12 17:24:28.823140 kernel: Initialise system trusted keyrings Dec 12 17:24:28.823151 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Dec 12 17:24:28.823159 kernel: Key type asymmetric registered Dec 12 17:24:28.823166 kernel: Asymmetric key parser 'x509' registered Dec 12 17:24:28.823174 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Dec 12 17:24:28.823181 kernel: io scheduler mq-deadline registered Dec 12 17:24:28.823189 kernel: io scheduler kyber registered Dec 12 17:24:28.823197 kernel: io scheduler bfq registered Dec 12 17:24:28.823205 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Dec 12 17:24:28.823287 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 50 Dec 12 17:24:28.823372 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 50 Dec 12 17:24:28.823446 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ 
MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:24:28.823510 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 51 Dec 12 17:24:28.823573 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 51 Dec 12 17:24:28.823651 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:24:28.823719 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 52 Dec 12 17:24:28.823782 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 52 Dec 12 17:24:28.823844 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:24:28.823943 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 53 Dec 12 17:24:28.824006 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 53 Dec 12 17:24:28.824068 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:24:28.824131 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 54 Dec 12 17:24:28.824191 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 54 Dec 12 17:24:28.824249 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:24:28.824313 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 55 Dec 12 17:24:28.824372 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 55 Dec 12 17:24:28.824434 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:24:28.824504 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 56 Dec 12 17:24:28.824595 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 56 Dec 12 17:24:28.824666 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ 
PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:24:28.824740 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 57 Dec 12 17:24:28.824815 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 57 Dec 12 17:24:28.824906 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:24:28.824921 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Dec 12 17:24:28.825018 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 58 Dec 12 17:24:28.825091 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 58 Dec 12 17:24:28.825155 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:24:28.825167 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Dec 12 17:24:28.825176 kernel: ACPI: button: Power Button [PWRB] Dec 12 17:24:28.825188 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Dec 12 17:24:28.825266 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Dec 12 17:24:28.825342 kernel: virtio-pci 0000:07:00.0: enabling device (0000 -> 0002) Dec 12 17:24:28.825358 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Dec 12 17:24:28.825366 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Dec 12 17:24:28.825434 kernel: serial 0000:00:04.0: enabling device (0000 -> 0001) Dec 12 17:24:28.825447 kernel: 0000:00:04.0: ttyS0 at I/O 0xa000 (irq = 45, base_baud = 115200) is a 16550A Dec 12 17:24:28.825455 kernel: thunder_xcv, ver 1.0 Dec 12 17:24:28.825465 kernel: thunder_bgx, ver 1.0 Dec 12 17:24:28.825473 kernel: nicpf, ver 1.0 Dec 12 17:24:28.825481 kernel: nicvf, ver 1.0 Dec 12 17:24:28.825565 kernel: rtc-efi rtc-efi.0: registered as rtc0 Dec 12 17:24:28.825680 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-12-12T17:24:28 UTC (1765560268) Dec 12 17:24:28.825693 kernel: hid: raw HID events 
driver (C) Jiri Kosina Dec 12 17:24:28.825701 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available Dec 12 17:24:28.825708 kernel: watchdog: NMI not fully supported Dec 12 17:24:28.825716 kernel: watchdog: Hard watchdog permanently disabled Dec 12 17:24:28.825723 kernel: NET: Registered PF_INET6 protocol family Dec 12 17:24:28.825730 kernel: Segment Routing with IPv6 Dec 12 17:24:28.825741 kernel: In-situ OAM (IOAM) with IPv6 Dec 12 17:24:28.825752 kernel: NET: Registered PF_PACKET protocol family Dec 12 17:24:28.825759 kernel: Key type dns_resolver registered Dec 12 17:24:28.825767 kernel: registered taskstats version 1 Dec 12 17:24:28.825777 kernel: Loading compiled-in X.509 certificates Dec 12 17:24:28.825786 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.61-flatcar: 92f3a94fb747a7ba7cbcfde1535be91b86f9429a' Dec 12 17:24:28.825794 kernel: Demotion targets for Node 0: null Dec 12 17:24:28.825803 kernel: Key type .fscrypt registered Dec 12 17:24:28.825812 kernel: Key type fscrypt-provisioning registered Dec 12 17:24:28.825820 kernel: ima: No TPM chip found, activating TPM-bypass! Dec 12 17:24:28.825829 kernel: ima: Allocated hash algorithm: sha1 Dec 12 17:24:28.825836 kernel: ima: No architecture policies found Dec 12 17:24:28.825844 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Dec 12 17:24:28.825851 kernel: clk: Disabling unused clocks Dec 12 17:24:28.825890 kernel: PM: genpd: Disabling unused power domains Dec 12 17:24:28.825903 kernel: Warning: unable to open an initial console. Dec 12 17:24:28.825920 kernel: Freeing unused kernel memory: 39552K Dec 12 17:24:28.825928 kernel: Run /init as init process Dec 12 17:24:28.825936 kernel: with arguments: Dec 12 17:24:28.825946 kernel: /init Dec 12 17:24:28.825953 kernel: with environment: Dec 12 17:24:28.825960 kernel: HOME=/ Dec 12 17:24:28.825967 kernel: TERM=linux Dec 12 17:24:28.825976 systemd[1]: Successfully made /usr/ read-only. 
Dec 12 17:24:28.825987 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Dec 12 17:24:28.825995 systemd[1]: Detected virtualization kvm. Dec 12 17:24:28.826007 systemd[1]: Detected architecture arm64. Dec 12 17:24:28.826016 systemd[1]: Running in initrd. Dec 12 17:24:28.826025 systemd[1]: No hostname configured, using default hostname. Dec 12 17:24:28.826035 systemd[1]: Hostname set to . Dec 12 17:24:28.826044 systemd[1]: Initializing machine ID from VM UUID. Dec 12 17:24:28.826052 systemd[1]: Queued start job for default target initrd.target. Dec 12 17:24:28.826063 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 12 17:24:28.826071 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 12 17:24:28.826083 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Dec 12 17:24:28.826093 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Dec 12 17:24:28.826102 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Dec 12 17:24:28.826112 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Dec 12 17:24:28.826122 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Dec 12 17:24:28.826130 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Dec 12 17:24:28.826138 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). 
Dec 12 17:24:28.826147 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Dec 12 17:24:28.826155 systemd[1]: Reached target paths.target - Path Units. Dec 12 17:24:28.826163 systemd[1]: Reached target slices.target - Slice Units. Dec 12 17:24:28.826171 systemd[1]: Reached target swap.target - Swaps. Dec 12 17:24:28.826178 systemd[1]: Reached target timers.target - Timer Units. Dec 12 17:24:28.826186 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Dec 12 17:24:28.826194 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Dec 12 17:24:28.826204 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Dec 12 17:24:28.826211 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Dec 12 17:24:28.826223 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Dec 12 17:24:28.826232 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Dec 12 17:24:28.826242 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Dec 12 17:24:28.826252 systemd[1]: Reached target sockets.target - Socket Units. Dec 12 17:24:28.826261 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Dec 12 17:24:28.826270 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Dec 12 17:24:28.826280 systemd[1]: Finished network-cleanup.service - Network Cleanup. Dec 12 17:24:28.826288 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Dec 12 17:24:28.826298 systemd[1]: Starting systemd-fsck-usr.service... Dec 12 17:24:28.826305 systemd[1]: Starting systemd-journald.service - Journal Service... Dec 12 17:24:28.826313 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... 
Dec 12 17:24:28.826321 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 12 17:24:28.826329 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Dec 12 17:24:28.826367 systemd-journald[245]: Collecting audit messages is disabled. Dec 12 17:24:28.826393 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Dec 12 17:24:28.826402 systemd[1]: Finished systemd-fsck-usr.service. Dec 12 17:24:28.826411 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Dec 12 17:24:28.826420 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Dec 12 17:24:28.826428 kernel: Bridge firewalling registered Dec 12 17:24:28.826435 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Dec 12 17:24:28.826446 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Dec 12 17:24:28.826456 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Dec 12 17:24:28.826467 systemd-journald[245]: Journal started Dec 12 17:24:28.826489 systemd-journald[245]: Runtime Journal (/run/log/journal/55fc44d825a24f1296a4f844bd663fcd) is 8M, max 76.5M, 68.5M free. Dec 12 17:24:28.789925 systemd-modules-load[247]: Inserted module 'overlay' Dec 12 17:24:28.813911 systemd-modules-load[247]: Inserted module 'br_netfilter' Dec 12 17:24:28.831812 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Dec 12 17:24:28.831856 systemd[1]: Started systemd-journald.service - Journal Service. Dec 12 17:24:28.837669 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 12 17:24:28.848103 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... 
Dec 12 17:24:28.851383 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Dec 12 17:24:28.855997 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 12 17:24:28.858100 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Dec 12 17:24:28.868838 systemd-tmpfiles[268]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Dec 12 17:24:28.874495 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 12 17:24:28.879428 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Dec 12 17:24:28.883832 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Dec 12 17:24:28.886742 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Dec 12 17:24:28.908897 dracut-cmdline[284]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=361f5baddf90aee3bc7ee7e9be879bc0cc94314f224faa1e2791d9b44cd3ec52 Dec 12 17:24:28.927398 systemd-resolved[281]: Positive Trust Anchors: Dec 12 17:24:28.927413 systemd-resolved[281]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Dec 12 17:24:28.927444 systemd-resolved[281]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Dec 12 17:24:28.933569 systemd-resolved[281]: Defaulting to hostname 'linux'. Dec 12 17:24:28.935656 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Dec 12 17:24:28.936487 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Dec 12 17:24:29.019905 kernel: SCSI subsystem initialized Dec 12 17:24:29.024899 kernel: Loading iSCSI transport class v2.0-870. Dec 12 17:24:29.032325 kernel: iscsi: registered transport (tcp) Dec 12 17:24:29.044949 kernel: iscsi: registered transport (qla4xxx) Dec 12 17:24:29.045048 kernel: QLogic iSCSI HBA Driver Dec 12 17:24:29.067633 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Dec 12 17:24:29.090019 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Dec 12 17:24:29.093803 systemd[1]: Reached target network-pre.target - Preparation for Network. Dec 12 17:24:29.147251 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Dec 12 17:24:29.149665 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... 
Dec 12 17:24:29.223918 kernel: raid6: neonx8 gen() 15665 MB/s Dec 12 17:24:29.237923 kernel: raid6: neonx4 gen() 15776 MB/s Dec 12 17:24:29.254994 kernel: raid6: neonx2 gen() 13144 MB/s Dec 12 17:24:29.271941 kernel: raid6: neonx1 gen() 10374 MB/s Dec 12 17:24:29.288934 kernel: raid6: int64x8 gen() 6864 MB/s Dec 12 17:24:29.305918 kernel: raid6: int64x4 gen() 7319 MB/s Dec 12 17:24:29.322921 kernel: raid6: int64x2 gen() 6082 MB/s Dec 12 17:24:29.339915 kernel: raid6: int64x1 gen() 5037 MB/s Dec 12 17:24:29.339991 kernel: raid6: using algorithm neonx4 gen() 15776 MB/s Dec 12 17:24:29.356939 kernel: raid6: .... xor() 12317 MB/s, rmw enabled Dec 12 17:24:29.357022 kernel: raid6: using neon recovery algorithm Dec 12 17:24:29.362091 kernel: xor: measuring software checksum speed Dec 12 17:24:29.362176 kernel: 8regs : 21601 MB/sec Dec 12 17:24:29.362195 kernel: 32regs : 21704 MB/sec Dec 12 17:24:29.362212 kernel: arm64_neon : 28041 MB/sec Dec 12 17:24:29.362920 kernel: xor: using function: arm64_neon (28041 MB/sec) Dec 12 17:24:29.415932 kernel: Btrfs loaded, zoned=no, fsverity=no Dec 12 17:24:29.424459 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Dec 12 17:24:29.428500 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 12 17:24:29.459455 systemd-udevd[492]: Using default interface naming scheme 'v255'. Dec 12 17:24:29.463920 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 12 17:24:29.467719 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Dec 12 17:24:29.504648 dracut-pre-trigger[499]: rd.md=0: removing MD RAID activation Dec 12 17:24:29.538624 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Dec 12 17:24:29.541522 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Dec 12 17:24:29.610714 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. 
Dec 12 17:24:29.613806 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Dec 12 17:24:29.694924 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues Dec 12 17:24:29.698468 kernel: scsi host0: Virtio SCSI HBA Dec 12 17:24:29.705965 kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU CD-ROM 2.5+ PQ: 0 ANSI: 5 Dec 12 17:24:29.706037 kernel: scsi 0:0:0:1: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 Dec 12 17:24:29.742967 kernel: sr 0:0:0:0: Power-on or device reset occurred Dec 12 17:24:29.743402 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 16x/50x cd/rw xa/form2 cdda tray Dec 12 17:24:29.743503 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Dec 12 17:24:29.743515 kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0 Dec 12 17:24:29.741146 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Dec 12 17:24:29.741290 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Dec 12 17:24:29.745379 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Dec 12 17:24:29.748171 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 12 17:24:29.766094 kernel: sd 0:0:0:1: Power-on or device reset occurred Dec 12 17:24:29.767210 kernel: sd 0:0:0:1: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB) Dec 12 17:24:29.767374 kernel: sd 0:0:0:1: [sda] Write Protect is off Dec 12 17:24:29.767453 kernel: sd 0:0:0:1: [sda] Mode Sense: 63 00 00 08 Dec 12 17:24:29.769341 kernel: sd 0:0:0:1: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Dec 12 17:24:29.779157 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Dec 12 17:24:29.779220 kernel: GPT:17805311 != 80003071 Dec 12 17:24:29.782068 kernel: GPT:Alternate GPT header not at the end of the disk. Dec 12 17:24:29.782116 kernel: GPT:17805311 != 80003071 Dec 12 17:24:29.782129 kernel: GPT: Use GNU Parted to correct GPT errors. 
Dec 12 17:24:29.784184 kernel: ACPI: bus type USB registered Dec 12 17:24:29.784226 kernel: usbcore: registered new interface driver usbfs Dec 12 17:24:29.784881 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Dec 12 17:24:29.784912 kernel: usbcore: registered new interface driver hub Dec 12 17:24:29.785893 kernel: usbcore: registered new device driver usb Dec 12 17:24:29.786877 kernel: sd 0:0:0:1: [sda] Attached SCSI disk Dec 12 17:24:29.794726 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 12 17:24:29.803017 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Dec 12 17:24:29.803204 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Dec 12 17:24:29.806893 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Dec 12 17:24:29.807088 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Dec 12 17:24:29.807170 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Dec 12 17:24:29.807253 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Dec 12 17:24:29.807893 kernel: hub 1-0:1.0: USB hub found Dec 12 17:24:29.808046 kernel: hub 1-0:1.0: 4 ports detected Dec 12 17:24:29.808916 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Dec 12 17:24:29.809088 kernel: hub 2-0:1.0: USB hub found Dec 12 17:24:29.809894 kernel: hub 2-0:1.0: 4 ports detected Dec 12 17:24:29.863762 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. Dec 12 17:24:29.877642 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. Dec 12 17:24:29.885842 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. Dec 12 17:24:29.886498 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A. 
Dec 12 17:24:29.891146 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Dec 12 17:24:29.904680 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Dec 12 17:24:29.906220 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Dec 12 17:24:29.907463 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 12 17:24:29.908062 systemd[1]: Reached target remote-fs.target - Remote File Systems. Dec 12 17:24:29.912007 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Dec 12 17:24:29.916030 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Dec 12 17:24:29.934388 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Dec 12 17:24:29.938502 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Dec 12 17:24:29.938968 disk-uuid[603]: Primary Header is updated. Dec 12 17:24:29.938968 disk-uuid[603]: Secondary Entries is updated. Dec 12 17:24:29.938968 disk-uuid[603]: Secondary Header is updated. 
Dec 12 17:24:30.045882 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Dec 12 17:24:30.180098 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1 Dec 12 17:24:30.180154 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Dec 12 17:24:30.180979 kernel: usbcore: registered new interface driver usbhid Dec 12 17:24:30.181006 kernel: usbhid: USB HID core driver Dec 12 17:24:30.284902 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd Dec 12 17:24:30.411887 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2 Dec 12 17:24:30.464934 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0 Dec 12 17:24:30.961421 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Dec 12 17:24:30.961967 disk-uuid[612]: The operation has completed successfully. Dec 12 17:24:31.028966 systemd[1]: disk-uuid.service: Deactivated successfully. Dec 12 17:24:31.030157 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Dec 12 17:24:31.063734 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Dec 12 17:24:31.090608 sh[629]: Success Dec 12 17:24:31.105895 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Dec 12 17:24:31.105955 kernel: device-mapper: uevent: version 1.0.3 Dec 12 17:24:31.105976 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Dec 12 17:24:31.117957 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Dec 12 17:24:31.175848 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. 
Dec 12 17:24:31.180032 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Dec 12 17:24:31.190382 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Dec 12 17:24:31.201891 kernel: BTRFS: device fsid 6d6d314d-b8a1-4727-8a34-8525e276a248 devid 1 transid 38 /dev/mapper/usr (254:0) scanned by mount (641) Dec 12 17:24:31.203921 kernel: BTRFS info (device dm-0): first mount of filesystem 6d6d314d-b8a1-4727-8a34-8525e276a248 Dec 12 17:24:31.203994 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Dec 12 17:24:31.210098 kernel: BTRFS info (device dm-0): enabling ssd optimizations Dec 12 17:24:31.210159 kernel: BTRFS info (device dm-0): disabling log replay at mount time Dec 12 17:24:31.210188 kernel: BTRFS info (device dm-0): enabling free space tree Dec 12 17:24:31.212304 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Dec 12 17:24:31.214398 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Dec 12 17:24:31.215748 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Dec 12 17:24:31.216996 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Dec 12 17:24:31.222114 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... 
Dec 12 17:24:31.248888 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (672)
Dec 12 17:24:31.251213 kernel: BTRFS info (device sda6): first mount of filesystem 4b8ce5a5-a2aa-4c44-bc9b-80e30d06d25f
Dec 12 17:24:31.251268 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Dec 12 17:24:31.256201 kernel: BTRFS info (device sda6): enabling ssd optimizations
Dec 12 17:24:31.256270 kernel: BTRFS info (device sda6): turning on async discard
Dec 12 17:24:31.256282 kernel: BTRFS info (device sda6): enabling free space tree
Dec 12 17:24:31.260900 kernel: BTRFS info (device sda6): last unmount of filesystem 4b8ce5a5-a2aa-4c44-bc9b-80e30d06d25f
Dec 12 17:24:31.262269 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Dec 12 17:24:31.264622 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Dec 12 17:24:31.371856 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Dec 12 17:24:31.376503 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Dec 12 17:24:31.420351 systemd-networkd[816]: lo: Link UP
Dec 12 17:24:31.421015 systemd-networkd[816]: lo: Gained carrier
Dec 12 17:24:31.421744 ignition[721]: Ignition 2.22.0
Dec 12 17:24:31.422593 systemd-networkd[816]: Enumeration completed
Dec 12 17:24:31.421752 ignition[721]: Stage: fetch-offline
Dec 12 17:24:31.422773 systemd[1]: Started systemd-networkd.service - Network Configuration.
Dec 12 17:24:31.421793 ignition[721]: no configs at "/usr/lib/ignition/base.d"
Dec 12 17:24:31.424422 systemd-networkd[816]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Dec 12 17:24:31.421801 ignition[721]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Dec 12 17:24:31.424426 systemd-networkd[816]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Dec 12 17:24:31.421904 ignition[721]: parsed url from cmdline: ""
Dec 12 17:24:31.424802 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Dec 12 17:24:31.421908 ignition[721]: no config URL provided
Dec 12 17:24:31.425147 systemd-networkd[816]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Dec 12 17:24:31.421913 ignition[721]: reading system config file "/usr/lib/ignition/user.ign"
Dec 12 17:24:31.425151 systemd-networkd[816]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Dec 12 17:24:31.421920 ignition[721]: no config at "/usr/lib/ignition/user.ign"
Dec 12 17:24:31.426087 systemd-networkd[816]: eth0: Link UP
Dec 12 17:24:31.421925 ignition[721]: failed to fetch config: resource requires networking
Dec 12 17:24:31.426759 systemd-networkd[816]: eth1: Link UP
Dec 12 17:24:31.422164 ignition[721]: Ignition finished successfully
Dec 12 17:24:31.427070 systemd-networkd[816]: eth0: Gained carrier
Dec 12 17:24:31.427080 systemd-networkd[816]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Dec 12 17:24:31.427829 systemd[1]: Reached target network.target - Network.
Dec 12 17:24:31.430439 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Dec 12 17:24:31.434634 systemd-networkd[816]: eth1: Gained carrier
Dec 12 17:24:31.434647 systemd-networkd[816]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Dec 12 17:24:31.463404 ignition[821]: Ignition 2.22.0
Dec 12 17:24:31.463421 ignition[821]: Stage: fetch
Dec 12 17:24:31.463574 ignition[821]: no configs at "/usr/lib/ignition/base.d"
Dec 12 17:24:31.463583 ignition[821]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Dec 12 17:24:31.463659 ignition[821]: parsed url from cmdline: ""
Dec 12 17:24:31.463662 ignition[821]: no config URL provided
Dec 12 17:24:31.463667 ignition[821]: reading system config file "/usr/lib/ignition/user.ign"
Dec 12 17:24:31.463674 ignition[821]: no config at "/usr/lib/ignition/user.ign"
Dec 12 17:24:31.463716 ignition[821]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1
Dec 12 17:24:31.464137 ignition[821]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable
Dec 12 17:24:31.481978 systemd-networkd[816]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Dec 12 17:24:31.494993 systemd-networkd[816]: eth0: DHCPv4 address 46.224.132.113/32, gateway 172.31.1.1 acquired from 172.31.1.1
Dec 12 17:24:31.664854 ignition[821]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2
Dec 12 17:24:31.673023 ignition[821]: GET result: OK
Dec 12 17:24:31.673285 ignition[821]: parsing config with SHA512: 1538259b1fb97fb4e32b53341f3ba0a8cb09f2ddc0e7c7c856ddac4d82ef5f3d3928d5d2ed542fa39289567af0923529b8980286a24c6bea8a2c32a3043616b6
Dec 12 17:24:31.682129 unknown[821]: fetched base config from "system"
Dec 12 17:24:31.682145 unknown[821]: fetched base config from "system"
Dec 12 17:24:31.682488 ignition[821]: fetch: fetch complete
Dec 12 17:24:31.682150 unknown[821]: fetched user config from "hetzner"
Dec 12 17:24:31.682494 ignition[821]: fetch: fetch passed
Dec 12 17:24:31.682562 ignition[821]: Ignition finished successfully
Dec 12 17:24:31.685562 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Dec 12 17:24:31.688350 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Dec 12 17:24:31.722712 ignition[829]: Ignition 2.22.0
Dec 12 17:24:31.722726 ignition[829]: Stage: kargs
Dec 12 17:24:31.722909 ignition[829]: no configs at "/usr/lib/ignition/base.d"
Dec 12 17:24:31.722919 ignition[829]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Dec 12 17:24:31.723769 ignition[829]: kargs: kargs passed
Dec 12 17:24:31.723815 ignition[829]: Ignition finished successfully
Dec 12 17:24:31.727958 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Dec 12 17:24:31.730397 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Dec 12 17:24:31.767053 ignition[836]: Ignition 2.22.0
Dec 12 17:24:31.767681 ignition[836]: Stage: disks
Dec 12 17:24:31.767848 ignition[836]: no configs at "/usr/lib/ignition/base.d"
Dec 12 17:24:31.768825 ignition[836]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Dec 12 17:24:31.769717 ignition[836]: disks: disks passed
Dec 12 17:24:31.769775 ignition[836]: Ignition finished successfully
Dec 12 17:24:31.772295 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Dec 12 17:24:31.773472 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Dec 12 17:24:31.774654 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Dec 12 17:24:31.775799 systemd[1]: Reached target local-fs.target - Local File Systems.
Dec 12 17:24:31.777741 systemd[1]: Reached target sysinit.target - System Initialization.
Dec 12 17:24:31.779679 systemd[1]: Reached target basic.target - Basic System.
Dec 12 17:24:31.782204 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Dec 12 17:24:31.828629 systemd-fsck[844]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks
Dec 12 17:24:31.834439 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Dec 12 17:24:31.838200 systemd[1]: Mounting sysroot.mount - /sysroot...
Dec 12 17:24:31.915906 kernel: EXT4-fs (sda9): mounted filesystem 895d7845-d0e8-43ae-a778-7804b473b868 r/w with ordered data mode. Quota mode: none.
Dec 12 17:24:31.916702 systemd[1]: Mounted sysroot.mount - /sysroot.
Dec 12 17:24:31.918697 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Dec 12 17:24:31.921975 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Dec 12 17:24:31.924056 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Dec 12 17:24:31.932982 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Dec 12 17:24:31.937953 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Dec 12 17:24:31.941984 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (852)
Dec 12 17:24:31.942008 kernel: BTRFS info (device sda6): first mount of filesystem 4b8ce5a5-a2aa-4c44-bc9b-80e30d06d25f
Dec 12 17:24:31.942018 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Dec 12 17:24:31.938045 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Dec 12 17:24:31.944944 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Dec 12 17:24:31.946996 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Dec 12 17:24:31.955598 kernel: BTRFS info (device sda6): enabling ssd optimizations
Dec 12 17:24:31.955653 kernel: BTRFS info (device sda6): turning on async discard
Dec 12 17:24:31.957029 kernel: BTRFS info (device sda6): enabling free space tree
Dec 12 17:24:31.965706 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Dec 12 17:24:32.003503 coreos-metadata[854]: Dec 12 17:24:32.003 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1
Dec 12 17:24:32.004950 coreos-metadata[854]: Dec 12 17:24:32.004 INFO Fetch successful
Dec 12 17:24:32.006012 coreos-metadata[854]: Dec 12 17:24:32.005 INFO wrote hostname ci-4459-2-2-0-24adfa6772 to /sysroot/etc/hostname
Dec 12 17:24:32.008377 initrd-setup-root[879]: cut: /sysroot/etc/passwd: No such file or directory
Dec 12 17:24:32.011426 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Dec 12 17:24:32.016888 initrd-setup-root[887]: cut: /sysroot/etc/group: No such file or directory
Dec 12 17:24:32.023053 initrd-setup-root[894]: cut: /sysroot/etc/shadow: No such file or directory
Dec 12 17:24:32.028533 initrd-setup-root[901]: cut: /sysroot/etc/gshadow: No such file or directory
Dec 12 17:24:32.132972 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Dec 12 17:24:32.135386 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Dec 12 17:24:32.136966 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Dec 12 17:24:32.164933 kernel: BTRFS info (device sda6): last unmount of filesystem 4b8ce5a5-a2aa-4c44-bc9b-80e30d06d25f
Dec 12 17:24:32.181119 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Dec 12 17:24:32.198222 ignition[970]: INFO : Ignition 2.22.0
Dec 12 17:24:32.198222 ignition[970]: INFO : Stage: mount
Dec 12 17:24:32.200097 ignition[970]: INFO : no configs at "/usr/lib/ignition/base.d"
Dec 12 17:24:32.200097 ignition[970]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Dec 12 17:24:32.200097 ignition[970]: INFO : mount: mount passed
Dec 12 17:24:32.200097 ignition[970]: INFO : Ignition finished successfully
Dec 12 17:24:32.204683 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Dec 12 17:24:32.205138 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Dec 12 17:24:32.209270 systemd[1]: Starting ignition-files.service - Ignition (files)...
Dec 12 17:24:32.235765 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Dec 12 17:24:32.271001 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (980)
Dec 12 17:24:32.272334 kernel: BTRFS info (device sda6): first mount of filesystem 4b8ce5a5-a2aa-4c44-bc9b-80e30d06d25f
Dec 12 17:24:32.272387 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Dec 12 17:24:32.277609 kernel: BTRFS info (device sda6): enabling ssd optimizations
Dec 12 17:24:32.277664 kernel: BTRFS info (device sda6): turning on async discard
Dec 12 17:24:32.277694 kernel: BTRFS info (device sda6): enabling free space tree
Dec 12 17:24:32.282006 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Dec 12 17:24:32.315739 ignition[997]: INFO : Ignition 2.22.0
Dec 12 17:24:32.315739 ignition[997]: INFO : Stage: files
Dec 12 17:24:32.316732 ignition[997]: INFO : no configs at "/usr/lib/ignition/base.d"
Dec 12 17:24:32.316732 ignition[997]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Dec 12 17:24:32.316732 ignition[997]: DEBUG : files: compiled without relabeling support, skipping
Dec 12 17:24:32.319108 ignition[997]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Dec 12 17:24:32.319108 ignition[997]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Dec 12 17:24:32.321108 ignition[997]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Dec 12 17:24:32.321941 ignition[997]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Dec 12 17:24:32.322999 unknown[997]: wrote ssh authorized keys file for user: core
Dec 12 17:24:32.324015 ignition[997]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Dec 12 17:24:32.325555 ignition[997]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Dec 12 17:24:32.326697 ignition[997]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1
Dec 12 17:24:32.371538 ignition[997]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Dec 12 17:24:32.472494 ignition[997]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Dec 12 17:24:32.475002 ignition[997]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Dec 12 17:24:32.475002 ignition[997]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Dec 12 17:24:32.475002 ignition[997]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Dec 12 17:24:32.475002 ignition[997]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Dec 12 17:24:32.475002 ignition[997]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Dec 12 17:24:32.475002 ignition[997]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Dec 12 17:24:32.475002 ignition[997]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Dec 12 17:24:32.475002 ignition[997]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Dec 12 17:24:32.475002 ignition[997]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Dec 12 17:24:32.484832 ignition[997]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Dec 12 17:24:32.484832 ignition[997]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Dec 12 17:24:32.484832 ignition[997]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Dec 12 17:24:32.484832 ignition[997]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Dec 12 17:24:32.484832 ignition[997]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-arm64.raw: attempt #1
Dec 12 17:24:32.780459 ignition[997]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Dec 12 17:24:33.068040 systemd-networkd[816]: eth0: Gained IPv6LL
Dec 12 17:24:33.132065 systemd-networkd[816]: eth1: Gained IPv6LL
Dec 12 17:24:33.346073 ignition[997]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Dec 12 17:24:33.346073 ignition[997]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Dec 12 17:24:33.349455 ignition[997]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Dec 12 17:24:33.352166 ignition[997]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Dec 12 17:24:33.352166 ignition[997]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Dec 12 17:24:33.352166 ignition[997]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Dec 12 17:24:33.352166 ignition[997]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Dec 12 17:24:33.352166 ignition[997]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Dec 12 17:24:33.352166 ignition[997]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Dec 12 17:24:33.352166 ignition[997]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service"
Dec 12 17:24:33.352166 ignition[997]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service"
Dec 12 17:24:33.352166 ignition[997]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json"
Dec 12 17:24:33.352166 ignition[997]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json"
Dec 12 17:24:33.352166 ignition[997]: INFO : files: files passed
Dec 12 17:24:33.352166 ignition[997]: INFO : Ignition finished successfully
Dec 12 17:24:33.354158 systemd[1]: Finished ignition-files.service - Ignition (files).
Dec 12 17:24:33.356775 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Dec 12 17:24:33.362078 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Dec 12 17:24:33.382170 systemd[1]: ignition-quench.service: Deactivated successfully.
Dec 12 17:24:33.382714 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Dec 12 17:24:33.392958 initrd-setup-root-after-ignition[1027]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Dec 12 17:24:33.392958 initrd-setup-root-after-ignition[1027]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Dec 12 17:24:33.395709 initrd-setup-root-after-ignition[1031]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Dec 12 17:24:33.398481 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Dec 12 17:24:33.399803 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Dec 12 17:24:33.401746 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Dec 12 17:24:33.456034 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Dec 12 17:24:33.456231 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Dec 12 17:24:33.459537 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Dec 12 17:24:33.460233 systemd[1]: Reached target initrd.target - Initrd Default Target.
Dec 12 17:24:33.461539 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Dec 12 17:24:33.462349 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Dec 12 17:24:33.491983 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Dec 12 17:24:33.495399 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Dec 12 17:24:33.526872 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Dec 12 17:24:33.528295 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Dec 12 17:24:33.529187 systemd[1]: Stopped target timers.target - Timer Units.
Dec 12 17:24:33.530343 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Dec 12 17:24:33.530563 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Dec 12 17:24:33.531994 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Dec 12 17:24:33.533181 systemd[1]: Stopped target basic.target - Basic System.
Dec 12 17:24:33.534131 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Dec 12 17:24:33.535229 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Dec 12 17:24:33.536332 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Dec 12 17:24:33.537376 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Dec 12 17:24:33.538409 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Dec 12 17:24:33.539401 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Dec 12 17:24:33.540461 systemd[1]: Stopped target sysinit.target - System Initialization.
Dec 12 17:24:33.541545 systemd[1]: Stopped target local-fs.target - Local File Systems.
Dec 12 17:24:33.542477 systemd[1]: Stopped target swap.target - Swaps.
Dec 12 17:24:33.543281 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Dec 12 17:24:33.543455 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Dec 12 17:24:33.544624 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Dec 12 17:24:33.545659 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Dec 12 17:24:33.546690 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Dec 12 17:24:33.547150 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Dec 12 17:24:33.547900 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Dec 12 17:24:33.548075 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Dec 12 17:24:33.549567 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Dec 12 17:24:33.549736 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Dec 12 17:24:33.550643 systemd[1]: ignition-files.service: Deactivated successfully.
Dec 12 17:24:33.550783 systemd[1]: Stopped ignition-files.service - Ignition (files).
Dec 12 17:24:33.551532 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Dec 12 17:24:33.551670 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Dec 12 17:24:33.554986 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Dec 12 17:24:33.558258 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Dec 12 17:24:33.558778 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Dec 12 17:24:33.558961 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Dec 12 17:24:33.561097 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Dec 12 17:24:33.561241 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Dec 12 17:24:33.567569 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Dec 12 17:24:33.570024 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Dec 12 17:24:33.580823 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Dec 12 17:24:33.587073 systemd[1]: sysroot-boot.service: Deactivated successfully.
Dec 12 17:24:33.587947 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Dec 12 17:24:33.595669 ignition[1051]: INFO : Ignition 2.22.0
Dec 12 17:24:33.595669 ignition[1051]: INFO : Stage: umount
Dec 12 17:24:33.596726 ignition[1051]: INFO : no configs at "/usr/lib/ignition/base.d"
Dec 12 17:24:33.596726 ignition[1051]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Dec 12 17:24:33.596726 ignition[1051]: INFO : umount: umount passed
Dec 12 17:24:33.596726 ignition[1051]: INFO : Ignition finished successfully
Dec 12 17:24:33.599317 systemd[1]: ignition-mount.service: Deactivated successfully.
Dec 12 17:24:33.599516 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Dec 12 17:24:33.601309 systemd[1]: ignition-disks.service: Deactivated successfully.
Dec 12 17:24:33.601404 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Dec 12 17:24:33.602530 systemd[1]: ignition-kargs.service: Deactivated successfully.
Dec 12 17:24:33.602573 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Dec 12 17:24:33.603629 systemd[1]: ignition-fetch.service: Deactivated successfully.
Dec 12 17:24:33.603667 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Dec 12 17:24:33.604733 systemd[1]: Stopped target network.target - Network.
Dec 12 17:24:33.605790 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Dec 12 17:24:33.605843 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Dec 12 17:24:33.606831 systemd[1]: Stopped target paths.target - Path Units.
Dec 12 17:24:33.607640 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Dec 12 17:24:33.609093 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Dec 12 17:24:33.609706 systemd[1]: Stopped target slices.target - Slice Units.
Dec 12 17:24:33.610661 systemd[1]: Stopped target sockets.target - Socket Units.
Dec 12 17:24:33.611534 systemd[1]: iscsid.socket: Deactivated successfully.
Dec 12 17:24:33.611577 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Dec 12 17:24:33.612406 systemd[1]: iscsiuio.socket: Deactivated successfully.
Dec 12 17:24:33.612436 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Dec 12 17:24:33.613495 systemd[1]: ignition-setup.service: Deactivated successfully.
Dec 12 17:24:33.613546 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Dec 12 17:24:33.614399 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Dec 12 17:24:33.614433 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Dec 12 17:24:33.615319 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Dec 12 17:24:33.615360 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Dec 12 17:24:33.616313 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Dec 12 17:24:33.617267 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Dec 12 17:24:33.627468 systemd[1]: systemd-resolved.service: Deactivated successfully.
Dec 12 17:24:33.627730 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Dec 12 17:24:33.633920 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Dec 12 17:24:33.634363 systemd[1]: systemd-networkd.service: Deactivated successfully.
Dec 12 17:24:33.634583 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Dec 12 17:24:33.638769 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Dec 12 17:24:33.639534 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Dec 12 17:24:33.641244 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Dec 12 17:24:33.641290 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Dec 12 17:24:33.643779 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Dec 12 17:24:33.644379 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Dec 12 17:24:33.644431 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Dec 12 17:24:33.645168 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 12 17:24:33.645215 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Dec 12 17:24:33.645900 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 12 17:24:33.645936 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Dec 12 17:24:33.647009 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Dec 12 17:24:33.647046 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Dec 12 17:24:33.648831 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Dec 12 17:24:33.655042 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec 12 17:24:33.655119 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Dec 12 17:24:33.668619 systemd[1]: systemd-udevd.service: Deactivated successfully.
Dec 12 17:24:33.669917 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Dec 12 17:24:33.673006 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Dec 12 17:24:33.673075 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Dec 12 17:24:33.674266 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Dec 12 17:24:33.674301 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Dec 12 17:24:33.675330 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Dec 12 17:24:33.675379 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Dec 12 17:24:33.676686 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Dec 12 17:24:33.676734 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Dec 12 17:24:33.678001 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Dec 12 17:24:33.678051 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Dec 12 17:24:33.680440 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Dec 12 17:24:33.681121 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Dec 12 17:24:33.681176 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Dec 12 17:24:33.682440 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Dec 12 17:24:33.682499 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Dec 12 17:24:33.684622 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Dec 12 17:24:33.684664 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Dec 12 17:24:33.686278 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Dec 12 17:24:33.686319 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Dec 12 17:24:33.687051 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec 12 17:24:33.687089 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Dec 12 17:24:33.692267 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
Dec 12 17:24:33.692329 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully.
Dec 12 17:24:33.692361 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Dec 12 17:24:33.692399 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Dec 12 17:24:33.692833 systemd[1]: network-cleanup.service: Deactivated successfully. Dec 12 17:24:33.692966 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Dec 12 17:24:33.699174 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Dec 12 17:24:33.699294 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Dec 12 17:24:33.700331 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Dec 12 17:24:33.702057 systemd[1]: Starting initrd-switch-root.service - Switch Root... Dec 12 17:24:33.741833 systemd[1]: Switching root. Dec 12 17:24:33.775295 systemd-journald[245]: Journal stopped Dec 12 17:24:34.732672 systemd-journald[245]: Received SIGTERM from PID 1 (systemd). Dec 12 17:24:34.732812 kernel: SELinux: policy capability network_peer_controls=1 Dec 12 17:24:34.732832 kernel: SELinux: policy capability open_perms=1 Dec 12 17:24:34.732843 kernel: SELinux: policy capability extended_socket_class=1 Dec 12 17:24:34.732852 kernel: SELinux: policy capability always_check_network=0 Dec 12 17:24:34.732888 kernel: SELinux: policy capability cgroup_seclabel=1 Dec 12 17:24:34.732900 kernel: SELinux: policy capability nnp_nosuid_transition=1 Dec 12 17:24:34.732910 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Dec 12 17:24:34.732919 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Dec 12 17:24:34.732931 kernel: SELinux: policy capability userspace_initial_context=0 Dec 12 17:24:34.732941 kernel: audit: type=1403 audit(1765560273.936:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Dec 12 17:24:34.732957 systemd[1]: Successfully loaded SELinux policy in 51.799ms. Dec 12 17:24:34.732980 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 6.438ms. 
Dec 12 17:24:34.732992 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Dec 12 17:24:34.733003 systemd[1]: Detected virtualization kvm. Dec 12 17:24:34.733013 systemd[1]: Detected architecture arm64. Dec 12 17:24:34.733024 systemd[1]: Detected first boot. Dec 12 17:24:34.733036 systemd[1]: Hostname set to . Dec 12 17:24:34.733052 systemd[1]: Initializing machine ID from VM UUID. Dec 12 17:24:34.733065 zram_generator::config[1095]: No configuration found. Dec 12 17:24:34.733078 kernel: NET: Registered PF_VSOCK protocol family Dec 12 17:24:34.733090 systemd[1]: Populated /etc with preset unit settings. Dec 12 17:24:34.733129 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Dec 12 17:24:34.733145 systemd[1]: initrd-switch-root.service: Deactivated successfully. Dec 12 17:24:34.733156 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Dec 12 17:24:34.733199 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Dec 12 17:24:34.733228 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Dec 12 17:24:34.733244 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Dec 12 17:24:34.733255 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Dec 12 17:24:34.733266 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Dec 12 17:24:34.733276 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Dec 12 17:24:34.733290 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. 
Dec 12 17:24:34.733301 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Dec 12 17:24:34.733312 systemd[1]: Created slice user.slice - User and Session Slice. Dec 12 17:24:34.733323 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 12 17:24:34.733334 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 12 17:24:34.733345 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Dec 12 17:24:34.733356 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Dec 12 17:24:34.733367 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Dec 12 17:24:34.733378 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Dec 12 17:24:34.733390 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... Dec 12 17:24:34.733401 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 12 17:24:34.733412 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Dec 12 17:24:34.733423 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Dec 12 17:24:34.733433 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Dec 12 17:24:34.733444 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Dec 12 17:24:34.733468 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Dec 12 17:24:34.733481 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 12 17:24:34.733493 systemd[1]: Reached target remote-fs.target - Remote File Systems. Dec 12 17:24:34.733524 systemd[1]: Reached target slices.target - Slice Units. Dec 12 17:24:34.733538 systemd[1]: Reached target swap.target - Swaps. 
Dec 12 17:24:34.733549 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Dec 12 17:24:34.733589 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Dec 12 17:24:34.733607 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Dec 12 17:24:34.733618 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Dec 12 17:24:34.733629 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Dec 12 17:24:34.733643 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Dec 12 17:24:34.733654 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Dec 12 17:24:34.733665 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Dec 12 17:24:34.733676 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Dec 12 17:24:34.733686 systemd[1]: Mounting media.mount - External Media Directory... Dec 12 17:24:34.733697 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Dec 12 17:24:34.733708 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Dec 12 17:24:34.733719 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Dec 12 17:24:34.733732 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Dec 12 17:24:34.733742 systemd[1]: Reached target machines.target - Containers. Dec 12 17:24:34.733754 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Dec 12 17:24:34.733765 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 12 17:24:34.733779 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... 
Dec 12 17:24:34.733790 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Dec 12 17:24:34.733800 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Dec 12 17:24:34.733818 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Dec 12 17:24:34.733829 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Dec 12 17:24:34.733841 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Dec 12 17:24:34.733852 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Dec 12 17:24:34.734176 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Dec 12 17:24:34.734199 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Dec 12 17:24:34.734210 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Dec 12 17:24:34.734221 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Dec 12 17:24:34.734232 systemd[1]: Stopped systemd-fsck-usr.service. Dec 12 17:24:34.734248 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 12 17:24:34.734260 systemd[1]: Starting systemd-journald.service - Journal Service... Dec 12 17:24:34.734272 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Dec 12 17:24:34.734284 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Dec 12 17:24:34.734297 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Dec 12 17:24:34.734308 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... 
Dec 12 17:24:34.734324 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Dec 12 17:24:34.734335 systemd[1]: verity-setup.service: Deactivated successfully. Dec 12 17:24:34.734346 systemd[1]: Stopped verity-setup.service. Dec 12 17:24:34.734403 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Dec 12 17:24:34.734438 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Dec 12 17:24:34.734450 systemd[1]: Mounted media.mount - External Media Directory. Dec 12 17:24:34.734488 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Dec 12 17:24:34.734518 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Dec 12 17:24:34.734532 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Dec 12 17:24:34.734542 kernel: loop: module loaded Dec 12 17:24:34.734554 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Dec 12 17:24:34.734565 systemd[1]: modprobe@configfs.service: Deactivated successfully. Dec 12 17:24:34.734576 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Dec 12 17:24:34.734589 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Dec 12 17:24:34.734599 kernel: fuse: init (API version 7.41) Dec 12 17:24:34.734610 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Dec 12 17:24:34.734621 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Dec 12 17:24:34.734632 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Dec 12 17:24:34.734643 systemd[1]: modprobe@fuse.service: Deactivated successfully. Dec 12 17:24:34.734653 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Dec 12 17:24:34.734664 systemd[1]: modprobe@loop.service: Deactivated successfully. Dec 12 17:24:34.734675 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. 
Dec 12 17:24:34.734688 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Dec 12 17:24:34.734699 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Dec 12 17:24:34.734710 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Dec 12 17:24:34.734721 systemd[1]: Reached target network-pre.target - Preparation for Network. Dec 12 17:24:34.734749 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Dec 12 17:24:34.734767 kernel: ACPI: bus type drm_connector registered Dec 12 17:24:34.734778 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Dec 12 17:24:34.734819 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Dec 12 17:24:34.734832 systemd[1]: Reached target local-fs.target - Local File Systems. Dec 12 17:24:34.734846 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Dec 12 17:24:34.734905 systemd-journald[1159]: Collecting audit messages is disabled. Dec 12 17:24:34.734932 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Dec 12 17:24:34.734943 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 12 17:24:34.734955 systemd-journald[1159]: Journal started Dec 12 17:24:34.734979 systemd-journald[1159]: Runtime Journal (/run/log/journal/55fc44d825a24f1296a4f844bd663fcd) is 8M, max 76.5M, 68.5M free. Dec 12 17:24:34.435888 systemd[1]: Queued start job for default target multi-user.target. Dec 12 17:24:34.463062 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Dec 12 17:24:34.463577 systemd[1]: systemd-journald.service: Deactivated successfully. Dec 12 17:24:34.737972 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... 
Dec 12 17:24:34.738008 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Dec 12 17:24:34.742946 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Dec 12 17:24:34.745891 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Dec 12 17:24:34.750898 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Dec 12 17:24:34.755997 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Dec 12 17:24:34.760903 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Dec 12 17:24:34.771412 systemd[1]: Started systemd-journald.service - Journal Service. Dec 12 17:24:34.772138 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Dec 12 17:24:34.773631 systemd[1]: modprobe@drm.service: Deactivated successfully. Dec 12 17:24:34.774666 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Dec 12 17:24:34.776366 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Dec 12 17:24:34.778841 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Dec 12 17:24:34.781091 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Dec 12 17:24:34.790975 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Dec 12 17:24:34.815678 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Dec 12 17:24:34.819627 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Dec 12 17:24:34.830512 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... 
Dec 12 17:24:34.837889 kernel: loop0: detected capacity change from 0 to 100632 Dec 12 17:24:34.852154 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Dec 12 17:24:34.865353 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Dec 12 17:24:34.869593 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Dec 12 17:24:34.883763 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Dec 12 17:24:34.885199 systemd-tmpfiles[1196]: ACLs are not supported, ignoring. Dec 12 17:24:34.885213 systemd-tmpfiles[1196]: ACLs are not supported, ignoring. Dec 12 17:24:34.886266 systemd-journald[1159]: Time spent on flushing to /var/log/journal/55fc44d825a24f1296a4f844bd663fcd is 33.206ms for 1185 entries. Dec 12 17:24:34.886266 systemd-journald[1159]: System Journal (/var/log/journal/55fc44d825a24f1296a4f844bd663fcd) is 8M, max 584.8M, 576.8M free. Dec 12 17:24:34.935013 systemd-journald[1159]: Received client request to flush runtime journal. Dec 12 17:24:34.935116 kernel: loop1: detected capacity change from 0 to 119840 Dec 12 17:24:34.935147 kernel: loop2: detected capacity change from 0 to 211168 Dec 12 17:24:34.895825 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Dec 12 17:24:34.901725 systemd[1]: Starting systemd-sysusers.service - Create System Users... Dec 12 17:24:34.939284 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Dec 12 17:24:34.968883 kernel: loop3: detected capacity change from 0 to 8 Dec 12 17:24:34.975724 systemd[1]: Finished systemd-sysusers.service - Create System Users. Dec 12 17:24:34.979107 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... 
Dec 12 17:24:34.984923 kernel: loop4: detected capacity change from 0 to 100632 Dec 12 17:24:35.005900 kernel: loop5: detected capacity change from 0 to 119840 Dec 12 17:24:35.014830 systemd-tmpfiles[1238]: ACLs are not supported, ignoring. Dec 12 17:24:35.015165 systemd-tmpfiles[1238]: ACLs are not supported, ignoring. Dec 12 17:24:35.021892 kernel: loop6: detected capacity change from 0 to 211168 Dec 12 17:24:35.022406 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 12 17:24:35.048892 kernel: loop7: detected capacity change from 0 to 8 Dec 12 17:24:35.051049 (sd-merge)[1239]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'. Dec 12 17:24:35.051492 (sd-merge)[1239]: Merged extensions into '/usr'. Dec 12 17:24:35.058834 systemd[1]: Reload requested from client PID 1195 ('systemd-sysext') (unit systemd-sysext.service)... Dec 12 17:24:35.059350 systemd[1]: Reloading... Dec 12 17:24:35.187992 zram_generator::config[1271]: No configuration found. Dec 12 17:24:35.258206 ldconfig[1191]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Dec 12 17:24:35.389741 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Dec 12 17:24:35.390199 systemd[1]: Reloading finished in 329 ms. Dec 12 17:24:35.404917 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Dec 12 17:24:35.406018 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Dec 12 17:24:35.417796 systemd[1]: Starting ensure-sysext.service... Dec 12 17:24:35.421930 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Dec 12 17:24:35.445117 systemd[1]: Reload requested from client PID 1306 ('systemctl') (unit ensure-sysext.service)... Dec 12 17:24:35.445136 systemd[1]: Reloading... 
Dec 12 17:24:35.453594 systemd-tmpfiles[1307]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Dec 12 17:24:35.454001 systemd-tmpfiles[1307]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Dec 12 17:24:35.454314 systemd-tmpfiles[1307]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Dec 12 17:24:35.454631 systemd-tmpfiles[1307]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Dec 12 17:24:35.455393 systemd-tmpfiles[1307]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Dec 12 17:24:35.455732 systemd-tmpfiles[1307]: ACLs are not supported, ignoring. Dec 12 17:24:35.455844 systemd-tmpfiles[1307]: ACLs are not supported, ignoring. Dec 12 17:24:35.459310 systemd-tmpfiles[1307]: Detected autofs mount point /boot during canonicalization of boot. Dec 12 17:24:35.459419 systemd-tmpfiles[1307]: Skipping /boot Dec 12 17:24:35.465758 systemd-tmpfiles[1307]: Detected autofs mount point /boot during canonicalization of boot. Dec 12 17:24:35.465905 systemd-tmpfiles[1307]: Skipping /boot Dec 12 17:24:35.498892 zram_generator::config[1330]: No configuration found. Dec 12 17:24:35.671497 systemd[1]: Reloading finished in 226 ms. Dec 12 17:24:35.700563 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Dec 12 17:24:35.708758 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 12 17:24:35.717069 systemd[1]: Starting audit-rules.service - Load Audit Rules... Dec 12 17:24:35.722144 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Dec 12 17:24:35.724526 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Dec 12 17:24:35.730675 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... 
Dec 12 17:24:35.733354 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 12 17:24:35.740204 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Dec 12 17:24:35.747586 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 12 17:24:35.750107 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Dec 12 17:24:35.756424 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Dec 12 17:24:35.763597 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Dec 12 17:24:35.766045 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 12 17:24:35.766191 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 12 17:24:35.768552 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 12 17:24:35.768708 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 12 17:24:35.768788 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 12 17:24:35.788230 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Dec 12 17:24:35.791799 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Dec 12 17:24:35.796455 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. 
Dec 12 17:24:35.797088 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Dec 12 17:24:35.799342 systemd[1]: modprobe@loop.service: Deactivated successfully. Dec 12 17:24:35.799951 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Dec 12 17:24:35.811583 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 12 17:24:35.813506 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Dec 12 17:24:35.815460 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Dec 12 17:24:35.823143 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Dec 12 17:24:35.824006 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 12 17:24:35.824197 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 12 17:24:35.827209 systemd[1]: Starting systemd-update-done.service - Update is Completed... Dec 12 17:24:35.834975 systemd[1]: Finished ensure-sysext.service. Dec 12 17:24:35.843467 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Dec 12 17:24:35.851365 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Dec 12 17:24:35.852788 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Dec 12 17:24:35.853042 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Dec 12 17:24:35.856185 systemd-udevd[1377]: Using default interface naming scheme 'v255'. Dec 12 17:24:35.860226 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). 
Dec 12 17:24:35.864009 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Dec 12 17:24:35.869971 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Dec 12 17:24:35.889382 systemd[1]: modprobe@drm.service: Deactivated successfully. Dec 12 17:24:35.893936 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Dec 12 17:24:35.900894 systemd[1]: Finished systemd-update-done.service - Update is Completed. Dec 12 17:24:35.902286 systemd[1]: modprobe@loop.service: Deactivated successfully. Dec 12 17:24:35.902954 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Dec 12 17:24:35.904853 systemd[1]: Started systemd-userdbd.service - User Database Manager. Dec 12 17:24:35.910391 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Dec 12 17:24:35.910769 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Dec 12 17:24:35.913112 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Dec 12 17:24:35.914843 augenrules[1416]: No rules Dec 12 17:24:35.915282 systemd[1]: audit-rules.service: Deactivated successfully. Dec 12 17:24:35.918930 systemd[1]: Finished audit-rules.service - Load Audit Rules. Dec 12 17:24:35.922418 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 12 17:24:35.927505 systemd[1]: Starting systemd-networkd.service - Network Configuration... Dec 12 17:24:36.010176 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. 
Dec 12 17:24:36.145906 kernel: mousedev: PS/2 mouse device common for all mice Dec 12 17:24:36.213292 systemd-networkd[1434]: lo: Link UP Dec 12 17:24:36.213302 systemd-networkd[1434]: lo: Gained carrier Dec 12 17:24:36.217067 systemd-networkd[1434]: Enumeration completed Dec 12 17:24:36.217316 systemd[1]: Started systemd-networkd.service - Network Configuration. Dec 12 17:24:36.217802 systemd-networkd[1434]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 12 17:24:36.217926 systemd-networkd[1434]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 12 17:24:36.219306 systemd-networkd[1434]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 12 17:24:36.219315 systemd-networkd[1434]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 12 17:24:36.219748 systemd-networkd[1434]: eth0: Link UP Dec 12 17:24:36.219845 systemd-networkd[1434]: eth0: Gained carrier Dec 12 17:24:36.219890 systemd-networkd[1434]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 12 17:24:36.221217 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Dec 12 17:24:36.223087 systemd-networkd[1434]: eth1: Link UP Dec 12 17:24:36.223911 systemd-networkd[1434]: eth1: Gained carrier Dec 12 17:24:36.224029 systemd-networkd[1434]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 12 17:24:36.224619 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Dec 12 17:24:36.230491 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. 
Dec 12 17:24:36.236191 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Dec 12 17:24:36.244947 systemd-networkd[1434]: eth0: DHCPv4 address 46.224.132.113/32, gateway 172.31.1.1 acquired from 172.31.1.1 Dec 12 17:24:36.267016 systemd-networkd[1434]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Dec 12 17:24:36.270568 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. Dec 12 17:24:36.270690 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 12 17:24:36.273666 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Dec 12 17:24:36.276654 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Dec 12 17:24:36.281268 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Dec 12 17:24:36.281887 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 12 17:24:36.281924 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 12 17:24:36.281949 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). 
Dec 12 17:24:36.308885 kernel: [drm] pci: virtio-gpu-pci detected at 0000:00:01.0 Dec 12 17:24:36.308959 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Dec 12 17:24:36.309009 kernel: [drm] features: -context_init Dec 12 17:24:36.311885 kernel: [drm] number of scanouts: 1 Dec 12 17:24:36.311962 kernel: [drm] number of cap sets: 0 Dec 12 17:24:36.312883 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 Dec 12 17:24:36.320982 kernel: Console: switching to colour frame buffer device 160x50 Dec 12 17:24:36.321948 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Dec 12 17:24:36.330571 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Dec 12 17:24:36.332258 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Dec 12 17:24:36.332479 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Dec 12 17:24:36.334334 systemd[1]: modprobe@loop.service: Deactivated successfully. Dec 12 17:24:36.334915 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Dec 12 17:24:36.336312 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Dec 12 17:24:36.337205 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Dec 12 17:24:36.337895 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Dec 12 17:24:36.347692 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Dec 12 17:24:36.347774 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Dec 12 17:24:36.382599 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. 
Dec 12 17:24:36.383630 systemd-resolved[1376]: Positive Trust Anchors:
Dec 12 17:24:36.383648 systemd-resolved[1376]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Dec 12 17:24:36.383681 systemd-resolved[1376]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Dec 12 17:24:36.384150 systemd[1]: Reached target time-set.target - System Time Set.
Dec 12 17:24:36.393766 systemd-resolved[1376]: Using system hostname 'ci-4459-2-2-0-24adfa6772'.
Dec 12 17:24:36.396416 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Dec 12 17:24:36.397650 systemd[1]: Reached target network.target - Network.
Dec 12 17:24:36.398225 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Dec 12 17:24:36.398857 systemd[1]: Reached target sysinit.target - System Initialization.
Dec 12 17:24:36.399523 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Dec 12 17:24:36.401080 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Dec 12 17:24:36.401902 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Dec 12 17:24:36.402877 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Dec 12 17:24:36.403562 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Dec 12 17:24:36.405186 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Dec 12 17:24:36.405224 systemd[1]: Reached target paths.target - Path Units.
Dec 12 17:24:36.405732 systemd[1]: Reached target timers.target - Timer Units.
Dec 12 17:24:36.407513 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Dec 12 17:24:36.410883 systemd[1]: Starting docker.socket - Docker Socket for the API...
Dec 12 17:24:36.417211 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Dec 12 17:24:36.418178 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Dec 12 17:24:36.418940 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Dec 12 17:24:36.428644 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Dec 12 17:24:36.430574 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Dec 12 17:24:36.433959 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Dec 12 17:24:36.436092 systemd[1]: Reached target sockets.target - Socket Units.
Dec 12 17:24:36.436841 systemd[1]: Reached target basic.target - Basic System.
Dec 12 17:24:36.438002 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Dec 12 17:24:36.438047 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Dec 12 17:24:36.440318 systemd[1]: Starting containerd.service - containerd container runtime...
Dec 12 17:24:36.444259 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Dec 12 17:24:36.448591 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Dec 12 17:24:36.455813 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Dec 12 17:24:36.463153 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Dec 12 17:24:36.465719 systemd-timesyncd[1409]: Contacted time server 178.215.228.24:123 (0.flatcar.pool.ntp.org).
Dec 12 17:24:36.466219 systemd-timesyncd[1409]: Initial clock synchronization to Fri 2025-12-12 17:24:36.270082 UTC.
Dec 12 17:24:36.468678 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Dec 12 17:24:36.469361 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Dec 12 17:24:36.473121 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Dec 12 17:24:36.481109 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Dec 12 17:24:36.509661 coreos-metadata[1510]: Dec 12 17:24:36.509 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1
Dec 12 17:24:36.511903 coreos-metadata[1510]: Dec 12 17:24:36.511 INFO Fetch successful
Dec 12 17:24:36.512035 coreos-metadata[1510]: Dec 12 17:24:36.511 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1
Dec 12 17:24:36.515296 coreos-metadata[1510]: Dec 12 17:24:36.512 INFO Fetch successful
Dec 12 17:24:36.521160 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent.
Dec 12 17:24:36.524170 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Dec 12 17:24:36.524783 jq[1513]: false
Dec 12 17:24:36.527318 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Dec 12 17:24:36.534097 systemd[1]: Starting systemd-logind.service - User Login Management...
Dec 12 17:24:36.536904 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Dec 12 17:24:36.548570 extend-filesystems[1514]: Found /dev/sda6
Dec 12 17:24:36.560330 extend-filesystems[1514]: Found /dev/sda9
Dec 12 17:24:36.566416 extend-filesystems[1514]: Checking size of /dev/sda9
Dec 12 17:24:36.579018 extend-filesystems[1514]: Resized partition /dev/sda9
Dec 12 17:24:36.581234 extend-filesystems[1540]: resize2fs 1.47.3 (8-Jul-2025)
Dec 12 17:24:36.587880 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks
Dec 12 17:24:36.595718 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Dec 12 17:24:36.596471 systemd[1]: Starting update-engine.service - Update Engine...
Dec 12 17:24:36.602377 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Dec 12 17:24:36.610506 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Dec 12 17:24:36.611517 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Dec 12 17:24:36.611985 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Dec 12 17:24:36.612562 systemd[1]: motdgen.service: Deactivated successfully.
Dec 12 17:24:36.612742 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Dec 12 17:24:36.634036 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Dec 12 17:24:36.636105 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Dec 12 17:24:36.656669 update_engine[1542]: I20251212 17:24:36.655984 1542 main.cc:92] Flatcar Update Engine starting
Dec 12 17:24:36.675139 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Dec 12 17:24:36.700610 (ntainerd)[1562]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Dec 12 17:24:36.714774 jq[1543]: true
Dec 12 17:24:36.714841 dbus-daemon[1511]: [system] SELinux support is enabled
Dec 12 17:24:36.715260 jq[1566]: true
Dec 12 17:24:36.719611 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Dec 12 17:24:36.725663 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Dec 12 17:24:36.725697 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Dec 12 17:24:36.726516 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Dec 12 17:24:36.726536 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Dec 12 17:24:36.730151 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec 12 17:24:36.732594 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Dec 12 17:24:36.736613 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Dec 12 17:24:36.739702 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Dec 12 17:24:36.740906 tar[1548]: linux-arm64/LICENSE
Dec 12 17:24:36.740906 tar[1548]: linux-arm64/helm
Dec 12 17:24:36.744764 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Dec 12 17:24:36.746678 systemd[1]: Started update-engine.service - Update Engine.
Dec 12 17:24:36.748877 update_engine[1542]: I20251212 17:24:36.748679 1542 update_check_scheduler.cc:74] Next update check in 10m5s
Dec 12 17:24:36.766188 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Dec 12 17:24:36.796900 bash[1590]: Updated "/home/core/.ssh/authorized_keys"
Dec 12 17:24:36.799059 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Dec 12 17:24:36.817321 systemd[1]: Starting sshkeys.service...
Dec 12 17:24:36.846889 kernel: EXT4-fs (sda9): resized filesystem to 9393147
Dec 12 17:24:36.877413 extend-filesystems[1540]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required
Dec 12 17:24:36.877413 extend-filesystems[1540]: old_desc_blocks = 1, new_desc_blocks = 5
Dec 12 17:24:36.877413 extend-filesystems[1540]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long.
Dec 12 17:24:36.885431 extend-filesystems[1514]: Resized filesystem in /dev/sda9
Dec 12 17:24:36.879766 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
Dec 12 17:24:36.886637 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
Dec 12 17:24:36.887727 systemd[1]: extend-filesystems.service: Deactivated successfully.
Dec 12 17:24:36.889022 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Dec 12 17:24:36.967013 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Dec 12 17:24:37.017399 coreos-metadata[1603]: Dec 12 17:24:37.017 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1
Dec 12 17:24:37.025284 coreos-metadata[1603]: Dec 12 17:24:37.018 INFO Fetch successful
Dec 12 17:24:37.027445 unknown[1603]: wrote ssh authorized keys file for user: core
Dec 12 17:24:37.063192 containerd[1562]: time="2025-12-12T17:24:37Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Dec 12 17:24:37.064111 containerd[1562]: time="2025-12-12T17:24:37.064074303Z" level=info msg="starting containerd" revision=4ac6c20c7bbf8177f29e46bbdc658fec02ffb8ad version=v2.0.7
Dec 12 17:24:37.081230 update-ssh-keys[1612]: Updated "/home/core/.ssh/authorized_keys"
Dec 12 17:24:37.080612 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
Dec 12 17:24:37.084035 systemd[1]: Finished sshkeys.service.
Dec 12 17:24:37.094108 containerd[1562]: time="2025-12-12T17:24:37.091611399Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="12.488µs"
Dec 12 17:24:37.095957 containerd[1562]: time="2025-12-12T17:24:37.094993847Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Dec 12 17:24:37.095957 containerd[1562]: time="2025-12-12T17:24:37.095045317Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Dec 12 17:24:37.095957 containerd[1562]: time="2025-12-12T17:24:37.095247842Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Dec 12 17:24:37.095957 containerd[1562]: time="2025-12-12T17:24:37.095268446Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Dec 12 17:24:37.095957 containerd[1562]: time="2025-12-12T17:24:37.095296932Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Dec 12 17:24:37.095957 containerd[1562]: time="2025-12-12T17:24:37.095356871Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Dec 12 17:24:37.095957 containerd[1562]: time="2025-12-12T17:24:37.095368967Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Dec 12 17:24:37.095957 containerd[1562]: time="2025-12-12T17:24:37.095592097Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Dec 12 17:24:37.095957 containerd[1562]: time="2025-12-12T17:24:37.095607549Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Dec 12 17:24:37.095957 containerd[1562]: time="2025-12-12T17:24:37.095618593Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Dec 12 17:24:37.095957 containerd[1562]: time="2025-12-12T17:24:37.095626982Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Dec 12 17:24:37.095957 containerd[1562]: time="2025-12-12T17:24:37.095704012Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Dec 12 17:24:37.099797 containerd[1562]: time="2025-12-12T17:24:37.099157558Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Dec 12 17:24:37.099797 containerd[1562]: time="2025-12-12T17:24:37.099242276Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Dec 12 17:24:37.099797 containerd[1562]: time="2025-12-12T17:24:37.099255621Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Dec 12 17:24:37.099797 containerd[1562]: time="2025-12-12T17:24:37.099294839Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Dec 12 17:24:37.099797 containerd[1562]: time="2025-12-12T17:24:37.099559292Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Dec 12 17:24:37.099797 containerd[1562]: time="2025-12-12T17:24:37.099652829Z" level=info msg="metadata content store policy set" policy=shared
Dec 12 17:24:37.113883 containerd[1562]: time="2025-12-12T17:24:37.112160783Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Dec 12 17:24:37.113883 containerd[1562]: time="2025-12-12T17:24:37.112253031Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Dec 12 17:24:37.113883 containerd[1562]: time="2025-12-12T17:24:37.112272621Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Dec 12 17:24:37.113883 containerd[1562]: time="2025-12-12T17:24:37.112289361Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Dec 12 17:24:37.113883 containerd[1562]: time="2025-12-12T17:24:37.112362489Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Dec 12 17:24:37.113883 containerd[1562]: time="2025-12-12T17:24:37.112374625Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Dec 12 17:24:37.113883 containerd[1562]: time="2025-12-12T17:24:37.112392692Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Dec 12 17:24:37.113883 containerd[1562]: time="2025-12-12T17:24:37.112404867Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Dec 12 17:24:37.113883 containerd[1562]: time="2025-12-12T17:24:37.112421998Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Dec 12 17:24:37.113883 containerd[1562]: time="2025-12-12T17:24:37.112432495Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Dec 12 17:24:37.113883 containerd[1562]: time="2025-12-12T17:24:37.112441314Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Dec 12 17:24:37.113883 containerd[1562]: time="2025-12-12T17:24:37.112454737Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Dec 12 17:24:37.113883 containerd[1562]: time="2025-12-12T17:24:37.112607392Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Dec 12 17:24:37.113883 containerd[1562]: time="2025-12-12T17:24:37.112673184Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Dec 12 17:24:37.114214 containerd[1562]: time="2025-12-12T17:24:37.112688520Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Dec 12 17:24:37.114214 containerd[1562]: time="2025-12-12T17:24:37.112708694Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Dec 12 17:24:37.114214 containerd[1562]: time="2025-12-12T17:24:37.112721064Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Dec 12 17:24:37.114214 containerd[1562]: time="2025-12-12T17:24:37.112734254Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Dec 12 17:24:37.114214 containerd[1562]: time="2025-12-12T17:24:37.112747014Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Dec 12 17:24:37.114214 containerd[1562]: time="2025-12-12T17:24:37.112756848Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Dec 12 17:24:37.114214 containerd[1562]: time="2025-12-12T17:24:37.112769100Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Dec 12 17:24:37.114214 containerd[1562]: time="2025-12-12T17:24:37.112780456Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Dec 12 17:24:37.114214 containerd[1562]: time="2025-12-12T17:24:37.112790992Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Dec 12 17:24:37.114214 containerd[1562]: time="2025-12-12T17:24:37.112993205Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Dec 12 17:24:37.114214 containerd[1562]: time="2025-12-12T17:24:37.113016970Z" level=info msg="Start snapshots syncer"
Dec 12 17:24:37.114214 containerd[1562]: time="2025-12-12T17:24:37.113051543Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Dec 12 17:24:37.114411 containerd[1562]: time="2025-12-12T17:24:37.113300896Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Dec 12 17:24:37.114411 containerd[1562]: time="2025-12-12T17:24:37.113348659Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Dec 12 17:24:37.114522 containerd[1562]: time="2025-12-12T17:24:37.113404461Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Dec 12 17:24:37.114522 containerd[1562]: time="2025-12-12T17:24:37.113515284Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Dec 12 17:24:37.114522 containerd[1562]: time="2025-12-12T17:24:37.113543419Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Dec 12 17:24:37.114522 containerd[1562]: time="2025-12-12T17:24:37.113554774Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Dec 12 17:24:37.114522 containerd[1562]: time="2025-12-12T17:24:37.113566910Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Dec 12 17:24:37.114522 containerd[1562]: time="2025-12-12T17:24:37.113578031Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Dec 12 17:24:37.114522 containerd[1562]: time="2025-12-12T17:24:37.113601991Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Dec 12 17:24:37.114522 containerd[1562]: time="2025-12-12T17:24:37.113615727Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Dec 12 17:24:37.114522 containerd[1562]: time="2025-12-12T17:24:37.113641091Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Dec 12 17:24:37.114522 containerd[1562]: time="2025-12-12T17:24:37.113653969Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Dec 12 17:24:37.114522 containerd[1562]: time="2025-12-12T17:24:37.113664349Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
Dec 12 17:24:37.114522 containerd[1562]: time="2025-12-12T17:24:37.113706610Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Dec 12 17:24:37.114522 containerd[1562]: time="2025-12-12T17:24:37.113725223Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Dec 12 17:24:37.114522 containerd[1562]: time="2025-12-12T17:24:37.113735408Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Dec 12 17:24:37.114849 containerd[1562]: time="2025-12-12T17:24:37.113750900Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Dec 12 17:24:37.114849 containerd[1562]: time="2025-12-12T17:24:37.113758900Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
Dec 12 17:24:37.114849 containerd[1562]: time="2025-12-12T17:24:37.113769162Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
Dec 12 17:24:37.114849 containerd[1562]: time="2025-12-12T17:24:37.113779972Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
Dec 12 17:24:37.121881 containerd[1562]: time="2025-12-12T17:24:37.118058604Z" level=info msg="runtime interface created"
Dec 12 17:24:37.121881 containerd[1562]: time="2025-12-12T17:24:37.118089549Z" level=info msg="created NRI interface"
Dec 12 17:24:37.121881 containerd[1562]: time="2025-12-12T17:24:37.118105704Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
Dec 12 17:24:37.121881 containerd[1562]: time="2025-12-12T17:24:37.118132434Z" level=info msg="Connect containerd service"
Dec 12 17:24:37.121881 containerd[1562]: time="2025-12-12T17:24:37.118169232Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Dec 12 17:24:37.121881 containerd[1562]: time="2025-12-12T17:24:37.119074743Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Dec 12 17:24:37.133273 systemd-logind[1523]: New seat seat0.
Dec 12 17:24:37.137456 systemd-logind[1523]: Watching system buttons on /dev/input/event0 (Power Button)
Dec 12 17:24:37.137697 systemd-logind[1523]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard)
Dec 12 17:24:37.138028 systemd[1]: Started systemd-logind.service - User Login Management.
Dec 12 17:24:37.246057 sshd_keygen[1555]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Dec 12 17:24:37.271414 locksmithd[1591]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Dec 12 17:24:37.275943 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Dec 12 17:24:37.280079 systemd[1]: Starting issuegen.service - Generate /run/issue...
Dec 12 17:24:37.292606 systemd-networkd[1434]: eth1: Gained IPv6LL
Dec 12 17:24:37.297935 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Dec 12 17:24:37.299615 systemd[1]: Reached target network-online.target - Network is Online.
Dec 12 17:24:37.303989 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 12 17:24:37.306572 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Dec 12 17:24:37.310780 containerd[1562]: time="2025-12-12T17:24:37.310314733Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Dec 12 17:24:37.311702 containerd[1562]: time="2025-12-12T17:24:37.310911929Z" level=info msg="Start subscribing containerd event"
Dec 12 17:24:37.311774 containerd[1562]: time="2025-12-12T17:24:37.311718402Z" level=info msg="Start recovering state"
Dec 12 17:24:37.311832 containerd[1562]: time="2025-12-12T17:24:37.311812758Z" level=info msg="Start event monitor"
Dec 12 17:24:37.311832 containerd[1562]: time="2025-12-12T17:24:37.311831645Z" level=info msg="Start cni network conf syncer for default"
Dec 12 17:24:37.311895 containerd[1562]: time="2025-12-12T17:24:37.311847293Z" level=info msg="Start streaming server"
Dec 12 17:24:37.311895 containerd[1562]: time="2025-12-12T17:24:37.311867701Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
Dec 12 17:24:37.311895 containerd[1562]: time="2025-12-12T17:24:37.311876676Z" level=info msg="runtime interface starting up..."
Dec 12 17:24:37.311895 containerd[1562]: time="2025-12-12T17:24:37.311882413Z" level=info msg="starting plugins..."
Dec 12 17:24:37.311970 containerd[1562]: time="2025-12-12T17:24:37.311896539Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
Dec 12 17:24:37.312012 containerd[1562]: time="2025-12-12T17:24:37.311667907Z" level=info msg=serving... address=/run/containerd/containerd.sock
Dec 12 17:24:37.312130 systemd[1]: Started containerd.service - containerd container runtime.
Dec 12 17:24:37.312989 containerd[1562]: time="2025-12-12T17:24:37.312733917Z" level=info msg="containerd successfully booted in 0.250126s"
Dec 12 17:24:37.318331 systemd[1]: issuegen.service: Deactivated successfully.
Dec 12 17:24:37.318886 systemd[1]: Finished issuegen.service - Generate /run/issue.
Dec 12 17:24:37.326972 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Dec 12 17:24:37.366074 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Dec 12 17:24:37.370199 systemd[1]: Started getty@tty1.service - Getty on tty1.
Dec 12 17:24:37.376632 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0.
Dec 12 17:24:37.378215 systemd[1]: Reached target getty.target - Login Prompts.
Dec 12 17:24:37.387480 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Dec 12 17:24:37.441431 tar[1548]: linux-arm64/README.md
Dec 12 17:24:37.459468 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Dec 12 17:24:37.868056 systemd-networkd[1434]: eth0: Gained IPv6LL
Dec 12 17:24:38.129892 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 12 17:24:38.131588 systemd[1]: Reached target multi-user.target - Multi-User System.
Dec 12 17:24:38.133241 systemd[1]: Startup finished in 2.300s (kernel) + 5.333s (initrd) + 4.248s (userspace) = 11.881s.
Dec 12 17:24:38.150995 (kubelet)[1670]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 12 17:24:38.637209 kubelet[1670]: E1212 17:24:38.637148 1670 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 12 17:24:38.641969 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 12 17:24:38.642152 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 12 17:24:38.643998 systemd[1]: kubelet.service: Consumed 875ms CPU time, 255.4M memory peak.
Dec 12 17:24:48.755144 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Dec 12 17:24:48.757737 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 12 17:24:48.932724 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 12 17:24:48.942742 (kubelet)[1690]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 12 17:24:49.002377 kubelet[1690]: E1212 17:24:49.002311 1690 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 12 17:24:49.008482 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 12 17:24:49.008687 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 12 17:24:49.009488 systemd[1]: kubelet.service: Consumed 185ms CPU time, 105.9M memory peak.
Dec 12 17:24:59.255306 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Dec 12 17:24:59.258787 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 12 17:24:59.434401 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 12 17:24:59.446373 (kubelet)[1704]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 17:24:59.491287 kubelet[1704]: E1212 17:24:59.491222 1704 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 17:24:59.496763 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 17:24:59.497405 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 17:24:59.498614 systemd[1]: kubelet.service: Consumed 171ms CPU time, 107.1M memory peak. Dec 12 17:25:09.505106 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Dec 12 17:25:09.508081 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:25:09.682318 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:25:09.690584 (kubelet)[1720]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 17:25:09.737658 kubelet[1720]: E1212 17:25:09.737611 1720 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 17:25:09.741386 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 17:25:09.741563 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 17:25:09.744000 systemd[1]: kubelet.service: Consumed 165ms CPU time, 107.2M memory peak. 
Dec 12 17:25:12.420252 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Dec 12 17:25:12.422525 systemd[1]: Started sshd@0-46.224.132.113:22-139.178.89.65:37006.service - OpenSSH per-connection server daemon (139.178.89.65:37006). Dec 12 17:25:13.426171 sshd[1727]: Accepted publickey for core from 139.178.89.65 port 37006 ssh2: RSA SHA256:iFtGnG2WH9XVjjUjszxJhaCaYvl4oOJ7+tJOMAqvDiA Dec 12 17:25:13.428987 sshd-session[1727]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:25:13.437243 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Dec 12 17:25:13.439218 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Dec 12 17:25:13.449117 systemd-logind[1523]: New session 1 of user core. Dec 12 17:25:13.465829 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Dec 12 17:25:13.470625 systemd[1]: Starting user@500.service - User Manager for UID 500... Dec 12 17:25:13.485964 (systemd)[1732]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Dec 12 17:25:13.490756 systemd-logind[1523]: New session c1 of user core. Dec 12 17:25:13.623813 systemd[1732]: Queued start job for default target default.target. Dec 12 17:25:13.635014 systemd[1732]: Created slice app.slice - User Application Slice. Dec 12 17:25:13.635045 systemd[1732]: Reached target paths.target - Paths. Dec 12 17:25:13.635085 systemd[1732]: Reached target timers.target - Timers. Dec 12 17:25:13.636542 systemd[1732]: Starting dbus.socket - D-Bus User Message Bus Socket... Dec 12 17:25:13.651554 systemd[1732]: Listening on dbus.socket - D-Bus User Message Bus Socket. Dec 12 17:25:13.651836 systemd[1732]: Reached target sockets.target - Sockets. Dec 12 17:25:13.652010 systemd[1732]: Reached target basic.target - Basic System. Dec 12 17:25:13.652151 systemd[1732]: Reached target default.target - Main User Target. 
Dec 12 17:25:13.652170 systemd[1]: Started user@500.service - User Manager for UID 500. Dec 12 17:25:13.652313 systemd[1732]: Startup finished in 153ms. Dec 12 17:25:13.663671 systemd[1]: Started session-1.scope - Session 1 of User core. Dec 12 17:25:14.349815 systemd[1]: Started sshd@1-46.224.132.113:22-139.178.89.65:37020.service - OpenSSH per-connection server daemon (139.178.89.65:37020). Dec 12 17:25:15.355570 sshd[1743]: Accepted publickey for core from 139.178.89.65 port 37020 ssh2: RSA SHA256:iFtGnG2WH9XVjjUjszxJhaCaYvl4oOJ7+tJOMAqvDiA Dec 12 17:25:15.357611 sshd-session[1743]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:25:15.364823 systemd-logind[1523]: New session 2 of user core. Dec 12 17:25:15.376121 systemd[1]: Started session-2.scope - Session 2 of User core. Dec 12 17:25:16.033393 sshd[1746]: Connection closed by 139.178.89.65 port 37020 Dec 12 17:25:16.034274 sshd-session[1743]: pam_unix(sshd:session): session closed for user core Dec 12 17:25:16.039420 systemd[1]: sshd@1-46.224.132.113:22-139.178.89.65:37020.service: Deactivated successfully. Dec 12 17:25:16.039619 systemd-logind[1523]: Session 2 logged out. Waiting for processes to exit. Dec 12 17:25:16.044248 systemd[1]: session-2.scope: Deactivated successfully. Dec 12 17:25:16.047021 systemd-logind[1523]: Removed session 2. Dec 12 17:25:16.203062 systemd[1]: Started sshd@2-46.224.132.113:22-139.178.89.65:37024.service - OpenSSH per-connection server daemon (139.178.89.65:37024). Dec 12 17:25:17.196430 sshd[1752]: Accepted publickey for core from 139.178.89.65 port 37024 ssh2: RSA SHA256:iFtGnG2WH9XVjjUjszxJhaCaYvl4oOJ7+tJOMAqvDiA Dec 12 17:25:17.198341 sshd-session[1752]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:25:17.203818 systemd-logind[1523]: New session 3 of user core. Dec 12 17:25:17.214250 systemd[1]: Started session-3.scope - Session 3 of User core. 
Dec 12 17:25:17.866915 sshd[1755]: Connection closed by 139.178.89.65 port 37024 Dec 12 17:25:17.866952 sshd-session[1752]: pam_unix(sshd:session): session closed for user core Dec 12 17:25:17.872742 systemd[1]: sshd@2-46.224.132.113:22-139.178.89.65:37024.service: Deactivated successfully. Dec 12 17:25:17.874969 systemd[1]: session-3.scope: Deactivated successfully. Dec 12 17:25:17.877345 systemd-logind[1523]: Session 3 logged out. Waiting for processes to exit. Dec 12 17:25:17.878644 systemd-logind[1523]: Removed session 3. Dec 12 17:25:18.034320 systemd[1]: Started sshd@3-46.224.132.113:22-139.178.89.65:37036.service - OpenSSH per-connection server daemon (139.178.89.65:37036). Dec 12 17:25:19.026668 sshd[1761]: Accepted publickey for core from 139.178.89.65 port 37036 ssh2: RSA SHA256:iFtGnG2WH9XVjjUjszxJhaCaYvl4oOJ7+tJOMAqvDiA Dec 12 17:25:19.028778 sshd-session[1761]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:25:19.034284 systemd-logind[1523]: New session 4 of user core. Dec 12 17:25:19.041187 systemd[1]: Started session-4.scope - Session 4 of User core. Dec 12 17:25:19.694124 sshd[1764]: Connection closed by 139.178.89.65 port 37036 Dec 12 17:25:19.695328 sshd-session[1761]: pam_unix(sshd:session): session closed for user core Dec 12 17:25:19.700382 systemd[1]: sshd@3-46.224.132.113:22-139.178.89.65:37036.service: Deactivated successfully. Dec 12 17:25:19.702539 systemd[1]: session-4.scope: Deactivated successfully. Dec 12 17:25:19.704965 systemd-logind[1523]: Session 4 logged out. Waiting for processes to exit. Dec 12 17:25:19.707654 systemd-logind[1523]: Removed session 4. Dec 12 17:25:19.755416 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Dec 12 17:25:19.758439 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Dec 12 17:25:19.864215 systemd[1]: Started sshd@4-46.224.132.113:22-139.178.89.65:37042.service - OpenSSH per-connection server daemon (139.178.89.65:37042). Dec 12 17:25:19.913370 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:25:19.929740 (kubelet)[1781]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 17:25:19.975114 kubelet[1781]: E1212 17:25:19.974664 1781 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 17:25:19.977959 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 17:25:19.978158 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 17:25:19.979049 systemd[1]: kubelet.service: Consumed 164ms CPU time, 107.1M memory peak. Dec 12 17:25:20.853043 sshd[1773]: Accepted publickey for core from 139.178.89.65 port 37042 ssh2: RSA SHA256:iFtGnG2WH9XVjjUjszxJhaCaYvl4oOJ7+tJOMAqvDiA Dec 12 17:25:20.855523 sshd-session[1773]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:25:20.864018 systemd-logind[1523]: New session 5 of user core. Dec 12 17:25:20.880733 systemd[1]: Started session-5.scope - Session 5 of User core. 
Dec 12 17:25:21.382537 sudo[1789]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Dec 12 17:25:21.382817 sudo[1789]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 12 17:25:21.396308 sudo[1789]: pam_unix(sudo:session): session closed for user root Dec 12 17:25:21.554550 sshd[1788]: Connection closed by 139.178.89.65 port 37042 Dec 12 17:25:21.556124 sshd-session[1773]: pam_unix(sshd:session): session closed for user core Dec 12 17:25:21.562428 systemd-logind[1523]: Session 5 logged out. Waiting for processes to exit. Dec 12 17:25:21.563281 systemd[1]: sshd@4-46.224.132.113:22-139.178.89.65:37042.service: Deactivated successfully. Dec 12 17:25:21.565069 systemd[1]: session-5.scope: Deactivated successfully. Dec 12 17:25:21.566849 systemd-logind[1523]: Removed session 5. Dec 12 17:25:21.731012 systemd[1]: Started sshd@5-46.224.132.113:22-139.178.89.65:48272.service - OpenSSH per-connection server daemon (139.178.89.65:48272). Dec 12 17:25:22.252108 update_engine[1542]: I20251212 17:25:22.251988 1542 update_attempter.cc:509] Updating boot flags... Dec 12 17:25:22.727942 sshd[1795]: Accepted publickey for core from 139.178.89.65 port 48272 ssh2: RSA SHA256:iFtGnG2WH9XVjjUjszxJhaCaYvl4oOJ7+tJOMAqvDiA Dec 12 17:25:22.730547 sshd-session[1795]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:25:22.738954 systemd-logind[1523]: New session 6 of user core. Dec 12 17:25:22.744189 systemd[1]: Started session-6.scope - Session 6 of User core. 
Dec 12 17:25:23.242595 sudo[1820]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Dec 12 17:25:23.243261 sudo[1820]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 12 17:25:23.251548 sudo[1820]: pam_unix(sudo:session): session closed for user root Dec 12 17:25:23.259258 sudo[1819]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Dec 12 17:25:23.259667 sudo[1819]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 12 17:25:23.272256 systemd[1]: Starting audit-rules.service - Load Audit Rules... Dec 12 17:25:23.335639 augenrules[1842]: No rules Dec 12 17:25:23.337242 systemd[1]: audit-rules.service: Deactivated successfully. Dec 12 17:25:23.337660 systemd[1]: Finished audit-rules.service - Load Audit Rules. Dec 12 17:25:23.341793 sudo[1819]: pam_unix(sudo:session): session closed for user root Dec 12 17:25:23.499565 sshd[1818]: Connection closed by 139.178.89.65 port 48272 Dec 12 17:25:23.499459 sshd-session[1795]: pam_unix(sshd:session): session closed for user core Dec 12 17:25:23.504556 systemd[1]: sshd@5-46.224.132.113:22-139.178.89.65:48272.service: Deactivated successfully. Dec 12 17:25:23.504722 systemd-logind[1523]: Session 6 logged out. Waiting for processes to exit. Dec 12 17:25:23.508980 systemd[1]: session-6.scope: Deactivated successfully. Dec 12 17:25:23.511301 systemd-logind[1523]: Removed session 6. Dec 12 17:25:23.672682 systemd[1]: Started sshd@6-46.224.132.113:22-139.178.89.65:48274.service - OpenSSH per-connection server daemon (139.178.89.65:48274). 
Dec 12 17:25:24.686232 sshd[1851]: Accepted publickey for core from 139.178.89.65 port 48274 ssh2: RSA SHA256:iFtGnG2WH9XVjjUjszxJhaCaYvl4oOJ7+tJOMAqvDiA Dec 12 17:25:24.688857 sshd-session[1851]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:25:24.695990 systemd-logind[1523]: New session 7 of user core. Dec 12 17:25:24.703189 systemd[1]: Started session-7.scope - Session 7 of User core. Dec 12 17:25:25.208765 sudo[1855]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Dec 12 17:25:25.209089 sudo[1855]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 12 17:25:25.544139 systemd[1]: Starting docker.service - Docker Application Container Engine... Dec 12 17:25:25.570749 (dockerd)[1873]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Dec 12 17:25:25.794126 dockerd[1873]: time="2025-12-12T17:25:25.794069966Z" level=info msg="Starting up" Dec 12 17:25:25.798022 dockerd[1873]: time="2025-12-12T17:25:25.797140881Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Dec 12 17:25:25.813092 dockerd[1873]: time="2025-12-12T17:25:25.813046424Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Dec 12 17:25:25.849065 dockerd[1873]: time="2025-12-12T17:25:25.849016119Z" level=info msg="Loading containers: start." Dec 12 17:25:25.859902 kernel: Initializing XFRM netlink socket Dec 12 17:25:26.124387 systemd-networkd[1434]: docker0: Link UP Dec 12 17:25:26.128887 dockerd[1873]: time="2025-12-12T17:25:26.128770953Z" level=info msg="Loading containers: done." Dec 12 17:25:26.148620 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck4055845766-merged.mount: Deactivated successfully. 
Dec 12 17:25:26.150908 dockerd[1873]: time="2025-12-12T17:25:26.150747275Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Dec 12 17:25:26.151566 dockerd[1873]: time="2025-12-12T17:25:26.151126959Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Dec 12 17:25:26.151566 dockerd[1873]: time="2025-12-12T17:25:26.151291481Z" level=info msg="Initializing buildkit" Dec 12 17:25:26.179168 dockerd[1873]: time="2025-12-12T17:25:26.179125266Z" level=info msg="Completed buildkit initialization" Dec 12 17:25:26.189612 dockerd[1873]: time="2025-12-12T17:25:26.189558501Z" level=info msg="Daemon has completed initialization" Dec 12 17:25:26.190212 dockerd[1873]: time="2025-12-12T17:25:26.190057827Z" level=info msg="API listen on /run/docker.sock" Dec 12 17:25:26.190660 systemd[1]: Started docker.service - Docker Application Container Engine. Dec 12 17:25:27.327776 containerd[1562]: time="2025-12-12T17:25:27.327739444Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.7\"" Dec 12 17:25:27.922454 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2293075124.mount: Deactivated successfully. 
Dec 12 17:25:28.869623 containerd[1562]: time="2025-12-12T17:25:28.869508686Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:25:28.870807 containerd[1562]: time="2025-12-12T17:25:28.870768819Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.7: active requests=0, bytes read=27387379" Dec 12 17:25:28.872879 containerd[1562]: time="2025-12-12T17:25:28.872634438Z" level=info msg="ImageCreate event name:\"sha256:6d7bc8e445519fe4d49eee834f33f3e165eef4d3c0919ba08c67cdf8db905b7e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:25:28.876695 containerd[1562]: time="2025-12-12T17:25:28.876639998Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:9585226cb85d1dc0f0ef5f7a75f04e4bc91ddd82de249533bd293aa3cf958dab\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:25:28.877883 containerd[1562]: time="2025-12-12T17:25:28.877835530Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.7\" with image id \"sha256:6d7bc8e445519fe4d49eee834f33f3e165eef4d3c0919ba08c67cdf8db905b7e\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.7\", repo digest \"registry.k8s.io/kube-apiserver@sha256:9585226cb85d1dc0f0ef5f7a75f04e4bc91ddd82de249533bd293aa3cf958dab\", size \"27383880\" in 1.550057165s" Dec 12 17:25:28.878012 containerd[1562]: time="2025-12-12T17:25:28.877996171Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.7\" returns image reference \"sha256:6d7bc8e445519fe4d49eee834f33f3e165eef4d3c0919ba08c67cdf8db905b7e\"" Dec 12 17:25:28.880171 containerd[1562]: time="2025-12-12T17:25:28.880079912Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.7\"" Dec 12 17:25:30.004610 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. 
Dec 12 17:25:30.008031 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:25:30.148959 containerd[1562]: time="2025-12-12T17:25:30.148415906Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:25:30.150437 containerd[1562]: time="2025-12-12T17:25:30.150108722Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.7: active requests=0, bytes read=23553101" Dec 12 17:25:30.151244 containerd[1562]: time="2025-12-12T17:25:30.150755408Z" level=info msg="ImageCreate event name:\"sha256:a94595d0240bcc5e538b4b33bbc890512a731425be69643cbee284072f7d8f64\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:25:30.154577 containerd[1562]: time="2025-12-12T17:25:30.154515842Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:f69d77ca0626b5a4b7b432c18de0952941181db7341c80eb89731f46d1d0c230\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:25:30.156413 containerd[1562]: time="2025-12-12T17:25:30.155547092Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.7\" with image id \"sha256:a94595d0240bcc5e538b4b33bbc890512a731425be69643cbee284072f7d8f64\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.7\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:f69d77ca0626b5a4b7b432c18de0952941181db7341c80eb89731f46d1d0c230\", size \"25137562\" in 1.275371699s" Dec 12 17:25:30.156413 containerd[1562]: time="2025-12-12T17:25:30.155589772Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.7\" returns image reference \"sha256:a94595d0240bcc5e538b4b33bbc890512a731425be69643cbee284072f7d8f64\"" Dec 12 17:25:30.156776 containerd[1562]: time="2025-12-12T17:25:30.156751863Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.7\"" Dec 12 17:25:30.164886 systemd[1]: 
Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:25:30.177689 (kubelet)[2153]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 17:25:30.228642 kubelet[2153]: E1212 17:25:30.228530 2153 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 17:25:30.233554 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 17:25:30.233721 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 17:25:30.236031 systemd[1]: kubelet.service: Consumed 167ms CPU time, 107.3M memory peak. Dec 12 17:25:31.206621 containerd[1562]: time="2025-12-12T17:25:31.205666480Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:25:31.207195 containerd[1562]: time="2025-12-12T17:25:31.207151453Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.7: active requests=0, bytes read=18298087" Dec 12 17:25:31.207783 containerd[1562]: time="2025-12-12T17:25:31.207748658Z" level=info msg="ImageCreate event name:\"sha256:94005b6be50f054c8a4ef3f0d6976644e8b3c6a8bf15a9e8a2eeac3e8331b010\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:25:31.211351 containerd[1562]: time="2025-12-12T17:25:31.211303890Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:21bda321d8b4d48eb059fbc1593203d55d8b3bc7acd0584e04e55504796d78d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:25:31.212657 containerd[1562]: time="2025-12-12T17:25:31.212616701Z" level=info msg="Pulled image 
\"registry.k8s.io/kube-scheduler:v1.33.7\" with image id \"sha256:94005b6be50f054c8a4ef3f0d6976644e8b3c6a8bf15a9e8a2eeac3e8331b010\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.7\", repo digest \"registry.k8s.io/kube-scheduler@sha256:21bda321d8b4d48eb059fbc1593203d55d8b3bc7acd0584e04e55504796d78d0\", size \"19882566\" in 1.055645836s" Dec 12 17:25:31.212657 containerd[1562]: time="2025-12-12T17:25:31.212653222Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.7\" returns image reference \"sha256:94005b6be50f054c8a4ef3f0d6976644e8b3c6a8bf15a9e8a2eeac3e8331b010\"" Dec 12 17:25:31.213118 containerd[1562]: time="2025-12-12T17:25:31.213095585Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.7\"" Dec 12 17:25:32.180309 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3816519721.mount: Deactivated successfully. Dec 12 17:25:32.525053 containerd[1562]: time="2025-12-12T17:25:32.524964597Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:25:32.526172 containerd[1562]: time="2025-12-12T17:25:32.526106366Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.7: active requests=0, bytes read=28258699" Dec 12 17:25:32.527052 containerd[1562]: time="2025-12-12T17:25:32.526993614Z" level=info msg="ImageCreate event name:\"sha256:78ccb937011a53894db229033fd54e237d478ec85315f8b08e5dcaa0f737111b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:25:32.530137 containerd[1562]: time="2025-12-12T17:25:32.530072720Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:ec25702b19026e9c0d339bc1c3bd231435a59f28b5fccb21e1b1078a357380f5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:25:32.530660 containerd[1562]: time="2025-12-12T17:25:32.530467923Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.7\" with image id 
\"sha256:78ccb937011a53894db229033fd54e237d478ec85315f8b08e5dcaa0f737111b\", repo tag \"registry.k8s.io/kube-proxy:v1.33.7\", repo digest \"registry.k8s.io/kube-proxy@sha256:ec25702b19026e9c0d339bc1c3bd231435a59f28b5fccb21e1b1078a357380f5\", size \"28257692\" in 1.316601691s" Dec 12 17:25:32.530660 containerd[1562]: time="2025-12-12T17:25:32.530499883Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.7\" returns image reference \"sha256:78ccb937011a53894db229033fd54e237d478ec85315f8b08e5dcaa0f737111b\"" Dec 12 17:25:32.531276 containerd[1562]: time="2025-12-12T17:25:32.531217330Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" Dec 12 17:25:33.116747 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2354752075.mount: Deactivated successfully. Dec 12 17:25:33.761646 containerd[1562]: time="2025-12-12T17:25:33.761594792Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:25:33.763682 containerd[1562]: time="2025-12-12T17:25:33.763614768Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=19152209" Dec 12 17:25:33.765889 containerd[1562]: time="2025-12-12T17:25:33.765074660Z" level=info msg="ImageCreate event name:\"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:25:33.768445 containerd[1562]: time="2025-12-12T17:25:33.768398567Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:25:33.770505 containerd[1562]: time="2025-12-12T17:25:33.770439223Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\", 
repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"19148915\" in 1.239181373s" Dec 12 17:25:33.770505 containerd[1562]: time="2025-12-12T17:25:33.770500304Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\"" Dec 12 17:25:33.771400 containerd[1562]: time="2025-12-12T17:25:33.771363671Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Dec 12 17:25:34.300076 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount508947970.mount: Deactivated successfully. Dec 12 17:25:34.307028 containerd[1562]: time="2025-12-12T17:25:34.306962144Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 12 17:25:34.308108 containerd[1562]: time="2025-12-12T17:25:34.308052992Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268723" Dec 12 17:25:34.308923 containerd[1562]: time="2025-12-12T17:25:34.308576596Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 12 17:25:34.310612 containerd[1562]: time="2025-12-12T17:25:34.310551292Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 12 17:25:34.311464 containerd[1562]: time="2025-12-12T17:25:34.311304218Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id 
\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 539.903226ms" Dec 12 17:25:34.311464 containerd[1562]: time="2025-12-12T17:25:34.311339818Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" Dec 12 17:25:34.312029 containerd[1562]: time="2025-12-12T17:25:34.311998983Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\"" Dec 12 17:25:34.889253 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount659154236.mount: Deactivated successfully. Dec 12 17:25:36.287434 containerd[1562]: time="2025-12-12T17:25:36.287373935Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:25:36.288738 containerd[1562]: time="2025-12-12T17:25:36.288703144Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=70013713" Dec 12 17:25:36.290881 containerd[1562]: time="2025-12-12T17:25:36.289395589Z" level=info msg="ImageCreate event name:\"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:25:36.292437 containerd[1562]: time="2025-12-12T17:25:36.292399451Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:25:36.293685 containerd[1562]: time="2025-12-12T17:25:36.293651260Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest 
\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"70026017\" in 1.981619837s" Dec 12 17:25:36.293783 containerd[1562]: time="2025-12-12T17:25:36.293767460Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\"" Dec 12 17:25:40.255464 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 6. Dec 12 17:25:40.259085 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:25:40.416120 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:25:40.426659 (kubelet)[2312]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 17:25:40.468502 kubelet[2312]: E1212 17:25:40.468452 2312 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 17:25:40.472169 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 17:25:40.472298 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 17:25:40.473249 systemd[1]: kubelet.service: Consumed 157ms CPU time, 105.7M memory peak. Dec 12 17:25:41.313798 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:25:41.314370 systemd[1]: kubelet.service: Consumed 157ms CPU time, 105.7M memory peak. Dec 12 17:25:41.317540 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:25:41.350818 systemd[1]: Reload requested from client PID 2326 ('systemctl') (unit session-7.scope)... Dec 12 17:25:41.350838 systemd[1]: Reloading... 
Dec 12 17:25:41.482902 zram_generator::config[2370]: No configuration found. Dec 12 17:25:41.663044 systemd[1]: Reloading finished in 311 ms. Dec 12 17:25:41.731541 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Dec 12 17:25:41.731826 systemd[1]: kubelet.service: Failed with result 'signal'. Dec 12 17:25:41.732333 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:25:41.733940 systemd[1]: kubelet.service: Consumed 109ms CPU time, 95M memory peak. Dec 12 17:25:41.737210 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:25:41.900305 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:25:41.916973 (kubelet)[2418]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Dec 12 17:25:41.964718 kubelet[2418]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 12 17:25:41.965896 kubelet[2418]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Dec 12 17:25:41.965896 kubelet[2418]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 12 17:25:41.965896 kubelet[2418]: I1212 17:25:41.965281 2418 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Dec 12 17:25:42.280159 kubelet[2418]: I1212 17:25:42.280097 2418 server.go:530] "Kubelet version" kubeletVersion="v1.33.0"
Dec 12 17:25:42.280159 kubelet[2418]: I1212 17:25:42.280145 2418 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 12 17:25:42.280691 kubelet[2418]: I1212 17:25:42.280661 2418 server.go:956] "Client rotation is on, will bootstrap in background"
Dec 12 17:25:42.311622 kubelet[2418]: E1212 17:25:42.311565 2418 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://46.224.132.113:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 46.224.132.113:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Dec 12 17:25:42.313890 kubelet[2418]: I1212 17:25:42.313463 2418 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Dec 12 17:25:42.323493 kubelet[2418]: I1212 17:25:42.323452 2418 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Dec 12 17:25:42.326431 kubelet[2418]: I1212 17:25:42.326402 2418 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Dec 12 17:25:42.328126 kubelet[2418]: I1212 17:25:42.328079 2418 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 12 17:25:42.328384 kubelet[2418]: I1212 17:25:42.328221 2418 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-2-0-24adfa6772","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 12 17:25:42.328591 kubelet[2418]: I1212 17:25:42.328576 2418 topology_manager.go:138] "Creating topology manager with none policy"
Dec 12 17:25:42.328642 kubelet[2418]: I1212 17:25:42.328635 2418 container_manager_linux.go:303] "Creating device plugin manager"
Dec 12 17:25:42.329924 kubelet[2418]: I1212 17:25:42.329710 2418 state_mem.go:36] "Initialized new in-memory state store"
Dec 12 17:25:42.333361 kubelet[2418]: I1212 17:25:42.333148 2418 kubelet.go:480] "Attempting to sync node with API server"
Dec 12 17:25:42.333361 kubelet[2418]: I1212 17:25:42.333179 2418 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 12 17:25:42.333361 kubelet[2418]: I1212 17:25:42.333204 2418 kubelet.go:386] "Adding apiserver pod source"
Dec 12 17:25:42.333361 kubelet[2418]: I1212 17:25:42.333219 2418 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 12 17:25:42.340239 kubelet[2418]: E1212 17:25:42.340207 2418 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://46.224.132.113:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-2-2-0-24adfa6772&limit=500&resourceVersion=0\": dial tcp 46.224.132.113:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Dec 12 17:25:42.341920 kubelet[2418]: I1212 17:25:42.340693 2418 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1"
Dec 12 17:25:42.341920 kubelet[2418]: I1212 17:25:42.341689 2418 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Dec 12 17:25:42.341920 kubelet[2418]: W1212 17:25:42.341846 2418 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Dec 12 17:25:42.344457 kubelet[2418]: E1212 17:25:42.344352 2418 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://46.224.132.113:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 46.224.132.113:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Dec 12 17:25:42.346386 kubelet[2418]: I1212 17:25:42.346353 2418 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Dec 12 17:25:42.346442 kubelet[2418]: I1212 17:25:42.346425 2418 server.go:1289] "Started kubelet"
Dec 12 17:25:42.347036 kubelet[2418]: I1212 17:25:42.346976 2418 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Dec 12 17:25:42.352455 kubelet[2418]: I1212 17:25:42.352156 2418 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 12 17:25:42.352816 kubelet[2418]: I1212 17:25:42.352785 2418 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 12 17:25:42.355288 kubelet[2418]: E1212 17:25:42.353263 2418 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://46.224.132.113:6443/api/v1/namespaces/default/events\": dial tcp 46.224.132.113:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459-2-2-0-24adfa6772.188087c6eb12aadd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459-2-2-0-24adfa6772,UID:ci-4459-2-2-0-24adfa6772,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459-2-2-0-24adfa6772,},FirstTimestamp:2025-12-12 17:25:42.346377949 +0000 UTC m=+0.423114646,LastTimestamp:2025-12-12 17:25:42.346377949 +0000 UTC m=+0.423114646,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-2-0-24adfa6772,}"
Dec 12 17:25:42.359411 kubelet[2418]: I1212 17:25:42.359378 2418 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 12 17:25:42.361329 kubelet[2418]: I1212 17:25:42.361278 2418 server.go:317] "Adding debug handlers to kubelet server"
Dec 12 17:25:42.363651 kubelet[2418]: I1212 17:25:42.363629 2418 volume_manager.go:297] "Starting Kubelet Volume Manager"
Dec 12 17:25:42.364029 kubelet[2418]: E1212 17:25:42.364004 2418 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4459-2-2-0-24adfa6772\" not found"
Dec 12 17:25:42.365615 kubelet[2418]: I1212 17:25:42.365579 2418 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Dec 12 17:25:42.367089 kubelet[2418]: E1212 17:25:42.367035 2418 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://46.224.132.113:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-0-24adfa6772?timeout=10s\": dial tcp 46.224.132.113:6443: connect: connection refused" interval="200ms"
Dec 12 17:25:42.367182 kubelet[2418]: I1212 17:25:42.367101 2418 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Dec 12 17:25:42.367448 kubelet[2418]: I1212 17:25:42.367415 2418 factory.go:223] Registration of the systemd container factory successfully
Dec 12 17:25:42.367546 kubelet[2418]: I1212 17:25:42.367513 2418 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Dec 12 17:25:42.368005 kubelet[2418]: E1212 17:25:42.367976 2418 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Dec 12 17:25:42.368543 kubelet[2418]: I1212 17:25:42.368518 2418 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Dec 12 17:25:42.371318 kubelet[2418]: E1212 17:25:42.371286 2418 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://46.224.132.113:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 46.224.132.113:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Dec 12 17:25:42.371404 kubelet[2418]: I1212 17:25:42.371362 2418 reconciler.go:26] "Reconciler: start to sync state"
Dec 12 17:25:42.379648 kubelet[2418]: I1212 17:25:42.379598 2418 factory.go:223] Registration of the containerd container factory successfully
Dec 12 17:25:42.388327 kubelet[2418]: I1212 17:25:42.388026 2418 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Dec 12 17:25:42.388327 kubelet[2418]: I1212 17:25:42.388052 2418 status_manager.go:230] "Starting to sync pod status with apiserver"
Dec 12 17:25:42.388327 kubelet[2418]: I1212 17:25:42.388077 2418 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Dec 12 17:25:42.388327 kubelet[2418]: I1212 17:25:42.388088 2418 kubelet.go:2436] "Starting kubelet main sync loop"
Dec 12 17:25:42.388327 kubelet[2418]: E1212 17:25:42.388129 2418 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Dec 12 17:25:42.392257 kubelet[2418]: E1212 17:25:42.392198 2418 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://46.224.132.113:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 46.224.132.113:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Dec 12 17:25:42.411168 kubelet[2418]: I1212 17:25:42.411140 2418 cpu_manager.go:221] "Starting CPU manager" policy="none"
Dec 12 17:25:42.411545 kubelet[2418]: I1212 17:25:42.411307 2418 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Dec 12 17:25:42.411545 kubelet[2418]: I1212 17:25:42.411333 2418 state_mem.go:36] "Initialized new in-memory state store"
Dec 12 17:25:42.414408 kubelet[2418]: I1212 17:25:42.414381 2418 policy_none.go:49] "None policy: Start"
Dec 12 17:25:42.414540 kubelet[2418]: I1212 17:25:42.414524 2418 memory_manager.go:186] "Starting memorymanager" policy="None"
Dec 12 17:25:42.414618 kubelet[2418]: I1212 17:25:42.414606 2418 state_mem.go:35] "Initializing new in-memory state store"
Dec 12 17:25:42.422637 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Dec 12 17:25:42.440523 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Dec 12 17:25:42.447310 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Dec 12 17:25:42.463567 kubelet[2418]: E1212 17:25:42.463517 2418 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Dec 12 17:25:42.464210 kubelet[2418]: I1212 17:25:42.463940 2418 eviction_manager.go:189] "Eviction manager: starting control loop"
Dec 12 17:25:42.464210 kubelet[2418]: I1212 17:25:42.463974 2418 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Dec 12 17:25:42.464680 kubelet[2418]: I1212 17:25:42.464502 2418 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Dec 12 17:25:42.467379 kubelet[2418]: E1212 17:25:42.467187 2418 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Dec 12 17:25:42.467379 kubelet[2418]: E1212 17:25:42.467244 2418 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4459-2-2-0-24adfa6772\" not found"
Dec 12 17:25:42.503997 systemd[1]: Created slice kubepods-burstable-pod830538c03d26db236d04c07f303f646f.slice - libcontainer container kubepods-burstable-pod830538c03d26db236d04c07f303f646f.slice.
Dec 12 17:25:42.515442 kubelet[2418]: E1212 17:25:42.515391 2418 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-0-24adfa6772\" not found" node="ci-4459-2-2-0-24adfa6772"
Dec 12 17:25:42.521660 systemd[1]: Created slice kubepods-burstable-pod80e3df58bcbdf1a5a7f49dc84715f1dc.slice - libcontainer container kubepods-burstable-pod80e3df58bcbdf1a5a7f49dc84715f1dc.slice.
Dec 12 17:25:42.524224 kubelet[2418]: E1212 17:25:42.524174 2418 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-0-24adfa6772\" not found" node="ci-4459-2-2-0-24adfa6772"
Dec 12 17:25:42.525960 systemd[1]: Created slice kubepods-burstable-pod8ab144787e02705192adb6507b00eca1.slice - libcontainer container kubepods-burstable-pod8ab144787e02705192adb6507b00eca1.slice.
Dec 12 17:25:42.527984 kubelet[2418]: E1212 17:25:42.527960 2418 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-0-24adfa6772\" not found" node="ci-4459-2-2-0-24adfa6772"
Dec 12 17:25:42.569035 kubelet[2418]: I1212 17:25:42.568076 2418 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-0-24adfa6772"
Dec 12 17:25:42.569126 kubelet[2418]: E1212 17:25:42.569058 2418 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://46.224.132.113:6443/api/v1/nodes\": dial tcp 46.224.132.113:6443: connect: connection refused" node="ci-4459-2-2-0-24adfa6772"
Dec 12 17:25:42.569940 kubelet[2418]: E1212 17:25:42.569897 2418 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://46.224.132.113:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-0-24adfa6772?timeout=10s\": dial tcp 46.224.132.113:6443: connect: connection refused" interval="400ms"
Dec 12 17:25:42.573570 kubelet[2418]: I1212 17:25:42.573373 2418 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/830538c03d26db236d04c07f303f646f-ca-certs\") pod \"kube-apiserver-ci-4459-2-2-0-24adfa6772\" (UID: \"830538c03d26db236d04c07f303f646f\") " pod="kube-system/kube-apiserver-ci-4459-2-2-0-24adfa6772"
Dec 12 17:25:42.573691 kubelet[2418]: I1212 17:25:42.573624 2418 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/830538c03d26db236d04c07f303f646f-k8s-certs\") pod \"kube-apiserver-ci-4459-2-2-0-24adfa6772\" (UID: \"830538c03d26db236d04c07f303f646f\") " pod="kube-system/kube-apiserver-ci-4459-2-2-0-24adfa6772"
Dec 12 17:25:42.573946 kubelet[2418]: I1212 17:25:42.573836 2418 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/830538c03d26db236d04c07f303f646f-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-2-0-24adfa6772\" (UID: \"830538c03d26db236d04c07f303f646f\") " pod="kube-system/kube-apiserver-ci-4459-2-2-0-24adfa6772"
Dec 12 17:25:42.574023 kubelet[2418]: I1212 17:25:42.573974 2418 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/80e3df58bcbdf1a5a7f49dc84715f1dc-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-2-0-24adfa6772\" (UID: \"80e3df58bcbdf1a5a7f49dc84715f1dc\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-0-24adfa6772"
Dec 12 17:25:42.574090 kubelet[2418]: I1212 17:25:42.574052 2418 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/80e3df58bcbdf1a5a7f49dc84715f1dc-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-2-0-24adfa6772\" (UID: \"80e3df58bcbdf1a5a7f49dc84715f1dc\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-0-24adfa6772"
Dec 12 17:25:42.574191 kubelet[2418]: I1212 17:25:42.574144 2418 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/80e3df58bcbdf1a5a7f49dc84715f1dc-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-2-0-24adfa6772\" (UID: \"80e3df58bcbdf1a5a7f49dc84715f1dc\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-0-24adfa6772"
Dec 12 17:25:42.574280 kubelet[2418]: I1212 17:25:42.574217 2418 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/80e3df58bcbdf1a5a7f49dc84715f1dc-ca-certs\") pod \"kube-controller-manager-ci-4459-2-2-0-24adfa6772\" (UID: \"80e3df58bcbdf1a5a7f49dc84715f1dc\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-0-24adfa6772"
Dec 12 17:25:42.574348 kubelet[2418]: I1212 17:25:42.574266 2418 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/80e3df58bcbdf1a5a7f49dc84715f1dc-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-2-0-24adfa6772\" (UID: \"80e3df58bcbdf1a5a7f49dc84715f1dc\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-0-24adfa6772"
Dec 12 17:25:42.574348 kubelet[2418]: I1212 17:25:42.574323 2418 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/8ab144787e02705192adb6507b00eca1-kubeconfig\") pod \"kube-scheduler-ci-4459-2-2-0-24adfa6772\" (UID: \"8ab144787e02705192adb6507b00eca1\") " pod="kube-system/kube-scheduler-ci-4459-2-2-0-24adfa6772"
Dec 12 17:25:42.772185 kubelet[2418]: I1212 17:25:42.772145 2418 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-0-24adfa6772"
Dec 12 17:25:42.772679 kubelet[2418]: E1212 17:25:42.772639 2418 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://46.224.132.113:6443/api/v1/nodes\": dial tcp 46.224.132.113:6443: connect: connection refused" node="ci-4459-2-2-0-24adfa6772"
Dec 12 17:25:42.817749 containerd[1562]: time="2025-12-12T17:25:42.817586581Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-2-0-24adfa6772,Uid:830538c03d26db236d04c07f303f646f,Namespace:kube-system,Attempt:0,}"
Dec 12 17:25:42.825822 containerd[1562]: time="2025-12-12T17:25:42.825316026Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-2-0-24adfa6772,Uid:80e3df58bcbdf1a5a7f49dc84715f1dc,Namespace:kube-system,Attempt:0,}"
Dec 12 17:25:42.829258 containerd[1562]: time="2025-12-12T17:25:42.829215369Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-2-0-24adfa6772,Uid:8ab144787e02705192adb6507b00eca1,Namespace:kube-system,Attempt:0,}"
Dec 12 17:25:42.862249 containerd[1562]: time="2025-12-12T17:25:42.862135161Z" level=info msg="connecting to shim b9c6ade0bfccd4a27c9866802d30677807aadde7ccb08bd5a77126d644afdfba" address="unix:///run/containerd/s/666e9eecd4eb48464eb3f3658ae88f993cd52af3aabf770e3f0fcd5b30c8a25e" namespace=k8s.io protocol=ttrpc version=3
Dec 12 17:25:42.880155 containerd[1562]: time="2025-12-12T17:25:42.880049986Z" level=info msg="connecting to shim a43c6af349d2977b47896da381fc2905b45cc17823e4af24856bc18bccf43796" address="unix:///run/containerd/s/f5a5b54b7ce646a0c511cfefd1dd137a3bc94e9ca97bab77ce392b893a1b22ad" namespace=k8s.io protocol=ttrpc version=3
Dec 12 17:25:42.880690 containerd[1562]: time="2025-12-12T17:25:42.880654909Z" level=info msg="connecting to shim 66db16d681c611875d727380fff0cf00dd0d8b798ca00c2234c5ee031b952b58" address="unix:///run/containerd/s/35d8e11efdb9e137958ace600bd84c23d885c8ad17f33b0f11f55af5270302e1" namespace=k8s.io protocol=ttrpc version=3
Dec 12 17:25:42.907117 systemd[1]: Started cri-containerd-b9c6ade0bfccd4a27c9866802d30677807aadde7ccb08bd5a77126d644afdfba.scope - libcontainer container b9c6ade0bfccd4a27c9866802d30677807aadde7ccb08bd5a77126d644afdfba.
Dec 12 17:25:42.911841 systemd[1]: Started cri-containerd-a43c6af349d2977b47896da381fc2905b45cc17823e4af24856bc18bccf43796.scope - libcontainer container a43c6af349d2977b47896da381fc2905b45cc17823e4af24856bc18bccf43796.
Dec 12 17:25:42.928039 systemd[1]: Started cri-containerd-66db16d681c611875d727380fff0cf00dd0d8b798ca00c2234c5ee031b952b58.scope - libcontainer container 66db16d681c611875d727380fff0cf00dd0d8b798ca00c2234c5ee031b952b58.
Dec 12 17:25:42.981481 kubelet[2418]: E1212 17:25:42.981442 2418 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://46.224.132.113:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-0-24adfa6772?timeout=10s\": dial tcp 46.224.132.113:6443: connect: connection refused" interval="800ms"
Dec 12 17:25:42.984893 containerd[1562]: time="2025-12-12T17:25:42.983407469Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-2-0-24adfa6772,Uid:830538c03d26db236d04c07f303f646f,Namespace:kube-system,Attempt:0,} returns sandbox id \"b9c6ade0bfccd4a27c9866802d30677807aadde7ccb08bd5a77126d644afdfba\""
Dec 12 17:25:42.992777 containerd[1562]: time="2025-12-12T17:25:42.992529762Z" level=info msg="CreateContainer within sandbox \"b9c6ade0bfccd4a27c9866802d30677807aadde7ccb08bd5a77126d644afdfba\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Dec 12 17:25:43.007712 containerd[1562]: time="2025-12-12T17:25:43.007466089Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-2-0-24adfa6772,Uid:80e3df58bcbdf1a5a7f49dc84715f1dc,Namespace:kube-system,Attempt:0,} returns sandbox id \"a43c6af349d2977b47896da381fc2905b45cc17823e4af24856bc18bccf43796\""
Dec 12 17:25:43.010504 containerd[1562]: time="2025-12-12T17:25:43.010338945Z" level=info msg="Container 09ca01f0a3b1e78e5e33ed4c5810200da12fd3823de777b57539c1c1a8f0c7c3: CDI devices from CRI Config.CDIDevices: []"
Dec 12 17:25:43.012689 containerd[1562]: time="2025-12-12T17:25:43.012615038Z" level=info msg="CreateContainer within sandbox \"a43c6af349d2977b47896da381fc2905b45cc17823e4af24856bc18bccf43796\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Dec 12 17:25:43.026569 containerd[1562]: time="2025-12-12T17:25:43.026412516Z" level=info msg="CreateContainer within sandbox \"b9c6ade0bfccd4a27c9866802d30677807aadde7ccb08bd5a77126d644afdfba\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"09ca01f0a3b1e78e5e33ed4c5810200da12fd3823de777b57539c1c1a8f0c7c3\""
Dec 12 17:25:43.030150 containerd[1562]: time="2025-12-12T17:25:43.029156571Z" level=info msg="StartContainer for \"09ca01f0a3b1e78e5e33ed4c5810200da12fd3823de777b57539c1c1a8f0c7c3\""
Dec 12 17:25:43.031382 containerd[1562]: time="2025-12-12T17:25:43.031343904Z" level=info msg="connecting to shim 09ca01f0a3b1e78e5e33ed4c5810200da12fd3823de777b57539c1c1a8f0c7c3" address="unix:///run/containerd/s/666e9eecd4eb48464eb3f3658ae88f993cd52af3aabf770e3f0fcd5b30c8a25e" protocol=ttrpc version=3
Dec 12 17:25:43.032334 containerd[1562]: time="2025-12-12T17:25:43.032301589Z" level=info msg="Container 141bfb998dfec7bf5820e408aaf2856e330d2c0d81a57d59f00c6c4ed67f9cbc: CDI devices from CRI Config.CDIDevices: []"
Dec 12 17:25:43.033222 containerd[1562]: time="2025-12-12T17:25:43.033168314Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-2-0-24adfa6772,Uid:8ab144787e02705192adb6507b00eca1,Namespace:kube-system,Attempt:0,} returns sandbox id \"66db16d681c611875d727380fff0cf00dd0d8b798ca00c2234c5ee031b952b58\""
Dec 12 17:25:43.043102 containerd[1562]: time="2025-12-12T17:25:43.043022930Z" level=info msg="CreateContainer within sandbox \"66db16d681c611875d727380fff0cf00dd0d8b798ca00c2234c5ee031b952b58\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Dec 12 17:25:43.046403 containerd[1562]: time="2025-12-12T17:25:43.046332669Z" level=info msg="CreateContainer within sandbox \"a43c6af349d2977b47896da381fc2905b45cc17823e4af24856bc18bccf43796\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"141bfb998dfec7bf5820e408aaf2856e330d2c0d81a57d59f00c6c4ed67f9cbc\""
Dec 12 17:25:43.047020 containerd[1562]: time="2025-12-12T17:25:43.046981912Z" level=info msg="StartContainer for \"141bfb998dfec7bf5820e408aaf2856e330d2c0d81a57d59f00c6c4ed67f9cbc\""
Dec 12 17:25:43.050351 containerd[1562]: time="2025-12-12T17:25:43.050299611Z" level=info msg="connecting to shim 141bfb998dfec7bf5820e408aaf2856e330d2c0d81a57d59f00c6c4ed67f9cbc" address="unix:///run/containerd/s/f5a5b54b7ce646a0c511cfefd1dd137a3bc94e9ca97bab77ce392b893a1b22ad" protocol=ttrpc version=3
Dec 12 17:25:43.066009 containerd[1562]: time="2025-12-12T17:25:43.065971420Z" level=info msg="Container a8d957a545236303dbe09a7bda25224eb83cd1be5aa27cf85a8c7c47a587c78c: CDI devices from CRI Config.CDIDevices: []"
Dec 12 17:25:43.069079 systemd[1]: Started cri-containerd-09ca01f0a3b1e78e5e33ed4c5810200da12fd3823de777b57539c1c1a8f0c7c3.scope - libcontainer container 09ca01f0a3b1e78e5e33ed4c5810200da12fd3823de777b57539c1c1a8f0c7c3.
Dec 12 17:25:43.072846 systemd[1]: Started cri-containerd-141bfb998dfec7bf5820e408aaf2856e330d2c0d81a57d59f00c6c4ed67f9cbc.scope - libcontainer container 141bfb998dfec7bf5820e408aaf2856e330d2c0d81a57d59f00c6c4ed67f9cbc.
Dec 12 17:25:43.080559 containerd[1562]: time="2025-12-12T17:25:43.080446902Z" level=info msg="CreateContainer within sandbox \"66db16d681c611875d727380fff0cf00dd0d8b798ca00c2234c5ee031b952b58\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"a8d957a545236303dbe09a7bda25224eb83cd1be5aa27cf85a8c7c47a587c78c\""
Dec 12 17:25:43.081922 containerd[1562]: time="2025-12-12T17:25:43.081841510Z" level=info msg="StartContainer for \"a8d957a545236303dbe09a7bda25224eb83cd1be5aa27cf85a8c7c47a587c78c\""
Dec 12 17:25:43.084409 containerd[1562]: time="2025-12-12T17:25:43.084374084Z" level=info msg="connecting to shim a8d957a545236303dbe09a7bda25224eb83cd1be5aa27cf85a8c7c47a587c78c" address="unix:///run/containerd/s/35d8e11efdb9e137958ace600bd84c23d885c8ad17f33b0f11f55af5270302e1" protocol=ttrpc version=3
Dec 12 17:25:43.127165 systemd[1]: Started cri-containerd-a8d957a545236303dbe09a7bda25224eb83cd1be5aa27cf85a8c7c47a587c78c.scope - libcontainer container a8d957a545236303dbe09a7bda25224eb83cd1be5aa27cf85a8c7c47a587c78c.
Dec 12 17:25:43.145892 containerd[1562]: time="2025-12-12T17:25:43.143685460Z" level=info msg="StartContainer for \"09ca01f0a3b1e78e5e33ed4c5810200da12fd3823de777b57539c1c1a8f0c7c3\" returns successfully"
Dec 12 17:25:43.166665 containerd[1562]: time="2025-12-12T17:25:43.166602630Z" level=info msg="StartContainer for \"141bfb998dfec7bf5820e408aaf2856e330d2c0d81a57d59f00c6c4ed67f9cbc\" returns successfully"
Dec 12 17:25:43.175717 kubelet[2418]: E1212 17:25:43.175670 2418 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://46.224.132.113:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 46.224.132.113:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Dec 12 17:25:43.180920 kubelet[2418]: I1212 17:25:43.180878 2418 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-0-24adfa6772"
Dec 12 17:25:43.182353 kubelet[2418]: E1212 17:25:43.182317 2418 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://46.224.132.113:6443/api/v1/nodes\": dial tcp 46.224.132.113:6443: connect: connection refused" node="ci-4459-2-2-0-24adfa6772"
Dec 12 17:25:43.232991 containerd[1562]: time="2025-12-12T17:25:43.232417642Z" level=info msg="StartContainer for \"a8d957a545236303dbe09a7bda25224eb83cd1be5aa27cf85a8c7c47a587c78c\" returns successfully"
Dec 12 17:25:43.414645 kubelet[2418]: E1212 17:25:43.414349 2418 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-0-24adfa6772\" not found" node="ci-4459-2-2-0-24adfa6772"
Dec 12 17:25:43.419324 kubelet[2418]: E1212 17:25:43.419298 2418 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-0-24adfa6772\" not found" node="ci-4459-2-2-0-24adfa6772"
Dec 12 17:25:43.420834 kubelet[2418]: E1212 17:25:43.420685 2418 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-0-24adfa6772\" not found" node="ci-4459-2-2-0-24adfa6772"
Dec 12 17:25:43.984882 kubelet[2418]: I1212 17:25:43.984430 2418 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-0-24adfa6772"
Dec 12 17:25:44.423139 kubelet[2418]: E1212 17:25:44.422951 2418 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-0-24adfa6772\" not found" node="ci-4459-2-2-0-24adfa6772"
Dec 12 17:25:44.423482 kubelet[2418]: E1212 17:25:44.423463 2418 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-0-24adfa6772\" not found" node="ci-4459-2-2-0-24adfa6772"
Dec 12 17:25:45.425594 kubelet[2418]: E1212 17:25:45.425202 2418 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-0-24adfa6772\" not found" node="ci-4459-2-2-0-24adfa6772"
Dec 12 17:25:45.428131 kubelet[2418]: E1212 17:25:45.425537 2418 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-0-24adfa6772\" not found" node="ci-4459-2-2-0-24adfa6772"
Dec 12 17:25:46.203261 kubelet[2418]: E1212 17:25:46.203209 2418 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4459-2-2-0-24adfa6772\" not found" node="ci-4459-2-2-0-24adfa6772"
Dec 12 17:25:46.293252 kubelet[2418]: I1212 17:25:46.292337 2418 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-2-0-24adfa6772"
Dec 12 17:25:46.293252 kubelet[2418]: E1212 17:25:46.292375 2418 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"ci-4459-2-2-0-24adfa6772\": node \"ci-4459-2-2-0-24adfa6772\" not found"
Dec 12 17:25:46.306568 kubelet[2418]: E1212 17:25:46.304365 2418 event.go:359] "Server rejected event (will not retry!)" err="namespaces \"default\" not found" event="&Event{ObjectMeta:{ci-4459-2-2-0-24adfa6772.188087c6eb12aadd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459-2-2-0-24adfa6772,UID:ci-4459-2-2-0-24adfa6772,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459-2-2-0-24adfa6772,},FirstTimestamp:2025-12-12 17:25:42.346377949 +0000 UTC m=+0.423114646,LastTimestamp:2025-12-12 17:25:42.346377949 +0000 UTC m=+0.423114646,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-2-0-24adfa6772,}"
Dec 12 17:25:46.345224 kubelet[2418]: I1212 17:25:46.344906 2418 apiserver.go:52] "Watching apiserver"
Dec 12 17:25:46.365095 kubelet[2418]: I1212 17:25:46.365009 2418 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-0-24adfa6772"
Dec 12 17:25:46.368104 kubelet[2418]: I1212 17:25:46.368058 2418 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Dec 12 17:25:46.373207 kubelet[2418]: E1212 17:25:46.373166 2418 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-2-0-24adfa6772\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459-2-2-0-24adfa6772"
Dec 12 17:25:46.373207 kubelet[2418]: I1212 17:25:46.373199 2418 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-2-0-24adfa6772"
Dec 12 17:25:46.376087 kubelet[2418]: E1212 17:25:46.375476 2418 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-2-2-0-24adfa6772\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4459-2-2-0-24adfa6772"
Dec 12 17:25:46.376087 kubelet[2418]: I1212 17:25:46.375502 2418 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-0-24adfa6772"
Dec 12 17:25:46.378796 kubelet[2418]: E1212 17:25:46.378769 2418 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-2-0-24adfa6772\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4459-2-2-0-24adfa6772"
Dec 12 17:25:47.545036 kubelet[2418]: I1212 17:25:47.544918 2418 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-2-0-24adfa6772"
Dec 12 17:25:48.380492 systemd[1]: Reload requested from client PID 2699 ('systemctl') (unit session-7.scope)...
Dec 12 17:25:48.380510 systemd[1]: Reloading...
Dec 12 17:25:48.495001 zram_generator::config[2743]: No configuration found.
Dec 12 17:25:48.716677 systemd[1]: Reloading finished in 335 ms.
Dec 12 17:25:48.745964 kubelet[2418]: I1212 17:25:48.745920 2418 dynamic_cafile_content.go:175] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Dec 12 17:25:48.746553 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 12 17:25:48.765855 systemd[1]: kubelet.service: Deactivated successfully.
Dec 12 17:25:48.766787 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 12 17:25:48.767116 systemd[1]: kubelet.service: Consumed 890ms CPU time, 126.1M memory peak.
Dec 12 17:25:48.772900 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 12 17:25:48.944027 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 12 17:25:48.960348 (kubelet)[2788]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Dec 12 17:25:49.012193 kubelet[2788]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 12 17:25:49.012193 kubelet[2788]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Dec 12 17:25:49.012193 kubelet[2788]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 12 17:25:49.012534 kubelet[2788]: I1212 17:25:49.012218 2788 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 12 17:25:49.023555 kubelet[2788]: I1212 17:25:49.023503 2788 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Dec 12 17:25:49.023555 kubelet[2788]: I1212 17:25:49.023538 2788 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 12 17:25:49.024027 kubelet[2788]: I1212 17:25:49.023786 2788 server.go:956] "Client rotation is on, will bootstrap in background" Dec 12 17:25:49.025567 kubelet[2788]: I1212 17:25:49.025514 2788 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Dec 12 17:25:49.028973 kubelet[2788]: I1212 17:25:49.028891 2788 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 12 17:25:49.037785 kubelet[2788]: I1212 17:25:49.037756 2788 server.go:1446] "Using cgroup driver setting received from the CRI runtime" 
cgroupDriver="systemd" Dec 12 17:25:49.043245 kubelet[2788]: I1212 17:25:49.043201 2788 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Dec 12 17:25:49.043512 kubelet[2788]: I1212 17:25:49.043429 2788 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 12 17:25:49.043797 kubelet[2788]: I1212 17:25:49.043518 2788 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-2-0-24adfa6772","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":n
ull,"CgroupVersion":2} Dec 12 17:25:49.043901 kubelet[2788]: I1212 17:25:49.043809 2788 topology_manager.go:138] "Creating topology manager with none policy" Dec 12 17:25:49.043901 kubelet[2788]: I1212 17:25:49.043819 2788 container_manager_linux.go:303] "Creating device plugin manager" Dec 12 17:25:49.045961 kubelet[2788]: I1212 17:25:49.045938 2788 state_mem.go:36] "Initialized new in-memory state store" Dec 12 17:25:49.046166 kubelet[2788]: I1212 17:25:49.046152 2788 kubelet.go:480] "Attempting to sync node with API server" Dec 12 17:25:49.046206 kubelet[2788]: I1212 17:25:49.046169 2788 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 12 17:25:49.046206 kubelet[2788]: I1212 17:25:49.046196 2788 kubelet.go:386] "Adding apiserver pod source" Dec 12 17:25:49.046247 kubelet[2788]: I1212 17:25:49.046211 2788 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 12 17:25:49.053118 kubelet[2788]: I1212 17:25:49.053079 2788 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Dec 12 17:25:49.053706 kubelet[2788]: I1212 17:25:49.053640 2788 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Dec 12 17:25:49.056226 kubelet[2788]: I1212 17:25:49.056183 2788 watchdog_linux.go:99] "Systemd watchdog is not enabled" Dec 12 17:25:49.056315 kubelet[2788]: I1212 17:25:49.056236 2788 server.go:1289] "Started kubelet" Dec 12 17:25:49.059526 kubelet[2788]: I1212 17:25:49.059362 2788 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 12 17:25:49.064819 kubelet[2788]: I1212 17:25:49.064761 2788 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Dec 12 17:25:49.068878 kubelet[2788]: I1212 17:25:49.067789 2788 server.go:317] "Adding debug handlers to kubelet server" Dec 12 17:25:49.072541 kubelet[2788]: I1212 17:25:49.072447 2788 
ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 12 17:25:49.072712 kubelet[2788]: I1212 17:25:49.072692 2788 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 12 17:25:49.073217 kubelet[2788]: I1212 17:25:49.073144 2788 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Dec 12 17:25:49.076215 kubelet[2788]: I1212 17:25:49.076192 2788 volume_manager.go:297] "Starting Kubelet Volume Manager" Dec 12 17:25:49.076624 kubelet[2788]: E1212 17:25:49.076581 2788 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4459-2-2-0-24adfa6772\" not found" Dec 12 17:25:49.078389 kubelet[2788]: I1212 17:25:49.077965 2788 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Dec 12 17:25:49.078389 kubelet[2788]: I1212 17:25:49.078082 2788 reconciler.go:26] "Reconciler: start to sync state" Dec 12 17:25:49.095222 kubelet[2788]: I1212 17:25:49.095183 2788 factory.go:223] Registration of the systemd container factory successfully Dec 12 17:25:49.095334 kubelet[2788]: I1212 17:25:49.095306 2788 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Dec 12 17:25:49.099043 kubelet[2788]: I1212 17:25:49.099010 2788 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Dec 12 17:25:49.103164 kubelet[2788]: I1212 17:25:49.103134 2788 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Dec 12 17:25:49.103301 kubelet[2788]: I1212 17:25:49.103291 2788 status_manager.go:230] "Starting to sync pod status with apiserver" Dec 12 17:25:49.103368 kubelet[2788]: I1212 17:25:49.103359 2788 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Dec 12 17:25:49.103731 kubelet[2788]: I1212 17:25:49.103422 2788 kubelet.go:2436] "Starting kubelet main sync loop" Dec 12 17:25:49.103731 kubelet[2788]: E1212 17:25:49.103467 2788 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 12 17:25:49.113400 kubelet[2788]: I1212 17:25:49.113364 2788 factory.go:223] Registration of the containerd container factory successfully Dec 12 17:25:49.171632 kubelet[2788]: I1212 17:25:49.171606 2788 cpu_manager.go:221] "Starting CPU manager" policy="none" Dec 12 17:25:49.171929 kubelet[2788]: I1212 17:25:49.171900 2788 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Dec 12 17:25:49.172849 kubelet[2788]: I1212 17:25:49.171998 2788 state_mem.go:36] "Initialized new in-memory state store" Dec 12 17:25:49.172849 kubelet[2788]: I1212 17:25:49.172134 2788 state_mem.go:88] "Updated default CPUSet" cpuSet="" Dec 12 17:25:49.172849 kubelet[2788]: I1212 17:25:49.172146 2788 state_mem.go:96] "Updated CPUSet assignments" assignments={} Dec 12 17:25:49.172849 kubelet[2788]: I1212 17:25:49.172164 2788 policy_none.go:49] "None policy: Start" Dec 12 17:25:49.172849 kubelet[2788]: I1212 17:25:49.172174 2788 memory_manager.go:186] "Starting memorymanager" policy="None" Dec 12 17:25:49.172849 kubelet[2788]: I1212 17:25:49.172181 2788 state_mem.go:35] "Initializing new in-memory state store" Dec 12 17:25:49.172849 kubelet[2788]: I1212 17:25:49.172262 2788 state_mem.go:75] "Updated machine memory state" Dec 12 17:25:49.177505 kubelet[2788]: E1212 17:25:49.177483 2788 manager.go:517] "Failed to 
read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Dec 12 17:25:49.178231 kubelet[2788]: I1212 17:25:49.178132 2788 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 12 17:25:49.178348 kubelet[2788]: I1212 17:25:49.178316 2788 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 12 17:25:49.179698 kubelet[2788]: I1212 17:25:49.179679 2788 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 12 17:25:49.187386 kubelet[2788]: E1212 17:25:49.187325 2788 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Dec 12 17:25:49.204375 kubelet[2788]: I1212 17:25:49.204330 2788 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-0-24adfa6772" Dec 12 17:25:49.204906 kubelet[2788]: I1212 17:25:49.204856 2788 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-2-0-24adfa6772" Dec 12 17:25:49.205017 kubelet[2788]: I1212 17:25:49.204524 2788 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-0-24adfa6772" Dec 12 17:25:49.217492 kubelet[2788]: E1212 17:25:49.217437 2788 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-2-2-0-24adfa6772\" already exists" pod="kube-system/kube-controller-manager-ci-4459-2-2-0-24adfa6772" Dec 12 17:25:49.291565 kubelet[2788]: I1212 17:25:49.291425 2788 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-0-24adfa6772" Dec 12 17:25:49.307574 kubelet[2788]: I1212 17:25:49.307524 2788 kubelet_node_status.go:124] "Node was previously registered" node="ci-4459-2-2-0-24adfa6772" Dec 12 17:25:49.308200 kubelet[2788]: I1212 17:25:49.308174 2788 kubelet_node_status.go:78] "Successfully registered node" 
node="ci-4459-2-2-0-24adfa6772" Dec 12 17:25:49.379744 kubelet[2788]: I1212 17:25:49.379426 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/830538c03d26db236d04c07f303f646f-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-2-0-24adfa6772\" (UID: \"830538c03d26db236d04c07f303f646f\") " pod="kube-system/kube-apiserver-ci-4459-2-2-0-24adfa6772" Dec 12 17:25:49.379744 kubelet[2788]: I1212 17:25:49.379478 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/80e3df58bcbdf1a5a7f49dc84715f1dc-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-2-0-24adfa6772\" (UID: \"80e3df58bcbdf1a5a7f49dc84715f1dc\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-0-24adfa6772" Dec 12 17:25:49.379744 kubelet[2788]: I1212 17:25:49.379511 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/830538c03d26db236d04c07f303f646f-ca-certs\") pod \"kube-apiserver-ci-4459-2-2-0-24adfa6772\" (UID: \"830538c03d26db236d04c07f303f646f\") " pod="kube-system/kube-apiserver-ci-4459-2-2-0-24adfa6772" Dec 12 17:25:49.379744 kubelet[2788]: I1212 17:25:49.379528 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/830538c03d26db236d04c07f303f646f-k8s-certs\") pod \"kube-apiserver-ci-4459-2-2-0-24adfa6772\" (UID: \"830538c03d26db236d04c07f303f646f\") " pod="kube-system/kube-apiserver-ci-4459-2-2-0-24adfa6772" Dec 12 17:25:49.379744 kubelet[2788]: I1212 17:25:49.379545 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/80e3df58bcbdf1a5a7f49dc84715f1dc-ca-certs\") pod 
\"kube-controller-manager-ci-4459-2-2-0-24adfa6772\" (UID: \"80e3df58bcbdf1a5a7f49dc84715f1dc\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-0-24adfa6772" Dec 12 17:25:49.379991 kubelet[2788]: I1212 17:25:49.379561 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/80e3df58bcbdf1a5a7f49dc84715f1dc-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-2-0-24adfa6772\" (UID: \"80e3df58bcbdf1a5a7f49dc84715f1dc\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-0-24adfa6772" Dec 12 17:25:49.379991 kubelet[2788]: I1212 17:25:49.379576 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/80e3df58bcbdf1a5a7f49dc84715f1dc-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-2-0-24adfa6772\" (UID: \"80e3df58bcbdf1a5a7f49dc84715f1dc\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-0-24adfa6772" Dec 12 17:25:49.379991 kubelet[2788]: I1212 17:25:49.379591 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/80e3df58bcbdf1a5a7f49dc84715f1dc-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-2-0-24adfa6772\" (UID: \"80e3df58bcbdf1a5a7f49dc84715f1dc\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-0-24adfa6772" Dec 12 17:25:49.379991 kubelet[2788]: I1212 17:25:49.379608 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/8ab144787e02705192adb6507b00eca1-kubeconfig\") pod \"kube-scheduler-ci-4459-2-2-0-24adfa6772\" (UID: \"8ab144787e02705192adb6507b00eca1\") " pod="kube-system/kube-scheduler-ci-4459-2-2-0-24adfa6772" Dec 12 17:25:50.048577 kubelet[2788]: I1212 17:25:50.048527 2788 apiserver.go:52] 
"Watching apiserver" Dec 12 17:25:50.078259 kubelet[2788]: I1212 17:25:50.078197 2788 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Dec 12 17:25:50.153612 kubelet[2788]: I1212 17:25:50.153558 2788 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-0-24adfa6772" Dec 12 17:25:50.154520 kubelet[2788]: I1212 17:25:50.154458 2788 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-0-24adfa6772" Dec 12 17:25:50.161901 kubelet[2788]: E1212 17:25:50.161846 2788 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-2-0-24adfa6772\" already exists" pod="kube-system/kube-apiserver-ci-4459-2-2-0-24adfa6772" Dec 12 17:25:50.167989 kubelet[2788]: E1212 17:25:50.167942 2788 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-2-0-24adfa6772\" already exists" pod="kube-system/kube-scheduler-ci-4459-2-2-0-24adfa6772" Dec 12 17:25:50.197045 kubelet[2788]: I1212 17:25:50.196972 2788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4459-2-2-0-24adfa6772" podStartSLOduration=1.196939444 podStartE2EDuration="1.196939444s" podCreationTimestamp="2025-12-12 17:25:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 17:25:50.17907288 +0000 UTC m=+1.211166645" watchObservedRunningTime="2025-12-12 17:25:50.196939444 +0000 UTC m=+1.229033249" Dec 12 17:25:50.197232 kubelet[2788]: I1212 17:25:50.197129 2788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4459-2-2-0-24adfa6772" podStartSLOduration=1.197123244 podStartE2EDuration="1.197123244s" podCreationTimestamp="2025-12-12 17:25:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2025-12-12 17:25:50.194930794 +0000 UTC m=+1.227024559" watchObservedRunningTime="2025-12-12 17:25:50.197123244 +0000 UTC m=+1.229217049" Dec 12 17:25:50.215908 kubelet[2788]: I1212 17:25:50.215778 2788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4459-2-2-0-24adfa6772" podStartSLOduration=3.215739252 podStartE2EDuration="3.215739252s" podCreationTimestamp="2025-12-12 17:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 17:25:50.215612651 +0000 UTC m=+1.247706536" watchObservedRunningTime="2025-12-12 17:25:50.215739252 +0000 UTC m=+1.247833177" Dec 12 17:25:52.976741 kubelet[2788]: I1212 17:25:52.976297 2788 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Dec 12 17:25:52.977725 containerd[1562]: time="2025-12-12T17:25:52.977582583Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Dec 12 17:25:52.978755 kubelet[2788]: I1212 17:25:52.978439 2788 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Dec 12 17:25:53.927338 systemd[1]: Created slice kubepods-besteffort-pod3a80f3ac_2862_4c05_bbd6_5ac17c4cd17e.slice - libcontainer container kubepods-besteffort-pod3a80f3ac_2862_4c05_bbd6_5ac17c4cd17e.slice. 
Dec 12 17:25:54.006180 kubelet[2788]: I1212 17:25:54.006112 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/3a80f3ac-2862-4c05-bbd6-5ac17c4cd17e-kube-proxy\") pod \"kube-proxy-fjnhb\" (UID: \"3a80f3ac-2862-4c05-bbd6-5ac17c4cd17e\") " pod="kube-system/kube-proxy-fjnhb" Dec 12 17:25:54.006621 kubelet[2788]: I1212 17:25:54.006225 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/3a80f3ac-2862-4c05-bbd6-5ac17c4cd17e-xtables-lock\") pod \"kube-proxy-fjnhb\" (UID: \"3a80f3ac-2862-4c05-bbd6-5ac17c4cd17e\") " pod="kube-system/kube-proxy-fjnhb" Dec 12 17:25:54.006621 kubelet[2788]: I1212 17:25:54.006271 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3a80f3ac-2862-4c05-bbd6-5ac17c4cd17e-lib-modules\") pod \"kube-proxy-fjnhb\" (UID: \"3a80f3ac-2862-4c05-bbd6-5ac17c4cd17e\") " pod="kube-system/kube-proxy-fjnhb" Dec 12 17:25:54.006621 kubelet[2788]: I1212 17:25:54.006317 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrq5h\" (UniqueName: \"kubernetes.io/projected/3a80f3ac-2862-4c05-bbd6-5ac17c4cd17e-kube-api-access-hrq5h\") pod \"kube-proxy-fjnhb\" (UID: \"3a80f3ac-2862-4c05-bbd6-5ac17c4cd17e\") " pod="kube-system/kube-proxy-fjnhb" Dec 12 17:25:54.235256 systemd[1]: Created slice kubepods-besteffort-podbb5b2122_6a8c_4571_8d4b_1fa5763761fb.slice - libcontainer container kubepods-besteffort-podbb5b2122_6a8c_4571_8d4b_1fa5763761fb.slice. 
Dec 12 17:25:54.238518 containerd[1562]: time="2025-12-12T17:25:54.238479463Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-fjnhb,Uid:3a80f3ac-2862-4c05-bbd6-5ac17c4cd17e,Namespace:kube-system,Attempt:0,}" Dec 12 17:25:54.272143 containerd[1562]: time="2025-12-12T17:25:54.272081447Z" level=info msg="connecting to shim fc039a8e0cf5298f18e85df7ee3fe1239cdb0c40fc8ef8657090cc1834f6dd81" address="unix:///run/containerd/s/c62e310af4e9d8aa9cc3ae607c5b77c824e14cf88fc9f461eb90cabe89ff05e2" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:25:54.307272 systemd[1]: Started cri-containerd-fc039a8e0cf5298f18e85df7ee3fe1239cdb0c40fc8ef8657090cc1834f6dd81.scope - libcontainer container fc039a8e0cf5298f18e85df7ee3fe1239cdb0c40fc8ef8657090cc1834f6dd81. Dec 12 17:25:54.308896 kubelet[2788]: I1212 17:25:54.308822 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/bb5b2122-6a8c-4571-8d4b-1fa5763761fb-var-lib-calico\") pod \"tigera-operator-7dcd859c48-s92g9\" (UID: \"bb5b2122-6a8c-4571-8d4b-1fa5763761fb\") " pod="tigera-operator/tigera-operator-7dcd859c48-s92g9" Dec 12 17:25:54.309247 kubelet[2788]: I1212 17:25:54.309044 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjgnf\" (UniqueName: \"kubernetes.io/projected/bb5b2122-6a8c-4571-8d4b-1fa5763761fb-kube-api-access-rjgnf\") pod \"tigera-operator-7dcd859c48-s92g9\" (UID: \"bb5b2122-6a8c-4571-8d4b-1fa5763761fb\") " pod="tigera-operator/tigera-operator-7dcd859c48-s92g9" Dec 12 17:25:54.337803 containerd[1562]: time="2025-12-12T17:25:54.337691570Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-fjnhb,Uid:3a80f3ac-2862-4c05-bbd6-5ac17c4cd17e,Namespace:kube-system,Attempt:0,} returns sandbox id \"fc039a8e0cf5298f18e85df7ee3fe1239cdb0c40fc8ef8657090cc1834f6dd81\"" Dec 12 17:25:54.348259 containerd[1562]: 
time="2025-12-12T17:25:54.348176975Z" level=info msg="CreateContainer within sandbox \"fc039a8e0cf5298f18e85df7ee3fe1239cdb0c40fc8ef8657090cc1834f6dd81\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Dec 12 17:25:54.368941 containerd[1562]: time="2025-12-12T17:25:54.368041301Z" level=info msg="Container 384f03b42b46833808436c95c843f998b6fa3830e902eba7a54090beec19c295: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:25:54.371169 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2423447999.mount: Deactivated successfully. Dec 12 17:25:54.381291 containerd[1562]: time="2025-12-12T17:25:54.381228397Z" level=info msg="CreateContainer within sandbox \"fc039a8e0cf5298f18e85df7ee3fe1239cdb0c40fc8ef8657090cc1834f6dd81\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"384f03b42b46833808436c95c843f998b6fa3830e902eba7a54090beec19c295\"" Dec 12 17:25:54.385105 containerd[1562]: time="2025-12-12T17:25:54.384898733Z" level=info msg="StartContainer for \"384f03b42b46833808436c95c843f998b6fa3830e902eba7a54090beec19c295\"" Dec 12 17:25:54.387880 containerd[1562]: time="2025-12-12T17:25:54.387810586Z" level=info msg="connecting to shim 384f03b42b46833808436c95c843f998b6fa3830e902eba7a54090beec19c295" address="unix:///run/containerd/s/c62e310af4e9d8aa9cc3ae607c5b77c824e14cf88fc9f461eb90cabe89ff05e2" protocol=ttrpc version=3 Dec 12 17:25:54.405112 systemd[1]: Started cri-containerd-384f03b42b46833808436c95c843f998b6fa3830e902eba7a54090beec19c295.scope - libcontainer container 384f03b42b46833808436c95c843f998b6fa3830e902eba7a54090beec19c295. 
Dec 12 17:25:54.481609 containerd[1562]: time="2025-12-12T17:25:54.481548589Z" level=info msg="StartContainer for \"384f03b42b46833808436c95c843f998b6fa3830e902eba7a54090beec19c295\" returns successfully" Dec 12 17:25:54.545002 containerd[1562]: time="2025-12-12T17:25:54.544916782Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-s92g9,Uid:bb5b2122-6a8c-4571-8d4b-1fa5763761fb,Namespace:tigera-operator,Attempt:0,}" Dec 12 17:25:54.562180 containerd[1562]: time="2025-12-12T17:25:54.562128136Z" level=info msg="connecting to shim 0a89e4ea8f2b879dd3816bdcf9dc6c7cb56256747102bed9d270e23530b49f23" address="unix:///run/containerd/s/26cb6ef1cec77fd5d8571867ad212ac16219331571f135ec2574cf455b244f95" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:25:54.596215 systemd[1]: Started cri-containerd-0a89e4ea8f2b879dd3816bdcf9dc6c7cb56256747102bed9d270e23530b49f23.scope - libcontainer container 0a89e4ea8f2b879dd3816bdcf9dc6c7cb56256747102bed9d270e23530b49f23. Dec 12 17:25:54.646171 containerd[1562]: time="2025-12-12T17:25:54.646120658Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-s92g9,Uid:bb5b2122-6a8c-4571-8d4b-1fa5763761fb,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"0a89e4ea8f2b879dd3816bdcf9dc6c7cb56256747102bed9d270e23530b49f23\"" Dec 12 17:25:54.649634 containerd[1562]: time="2025-12-12T17:25:54.649606193Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Dec 12 17:25:56.337579 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4037962324.mount: Deactivated successfully. 
Dec 12 17:25:56.739900 containerd[1562]: time="2025-12-12T17:25:56.739716152Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:25:56.741450 containerd[1562]: time="2025-12-12T17:25:56.741197038Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=22152004" Dec 12 17:25:56.742224 containerd[1562]: time="2025-12-12T17:25:56.742189322Z" level=info msg="ImageCreate event name:\"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:25:56.745833 containerd[1562]: time="2025-12-12T17:25:56.745774377Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:25:56.748438 containerd[1562]: time="2025-12-12T17:25:56.748389468Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"22147999\" in 2.098689395s" Dec 12 17:25:56.748663 containerd[1562]: time="2025-12-12T17:25:56.748556429Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\"" Dec 12 17:25:56.754955 containerd[1562]: time="2025-12-12T17:25:56.754852575Z" level=info msg="CreateContainer within sandbox \"0a89e4ea8f2b879dd3816bdcf9dc6c7cb56256747102bed9d270e23530b49f23\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Dec 12 17:25:56.770609 containerd[1562]: time="2025-12-12T17:25:56.769619396Z" level=info msg="Container 
eb26b6a3c3ea543c7f716a8a0daac4c8fdce335945eff23c0ee42cb3a0cedb49: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:25:56.770153 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4059335964.mount: Deactivated successfully. Dec 12 17:25:56.780951 containerd[1562]: time="2025-12-12T17:25:56.780903203Z" level=info msg="CreateContainer within sandbox \"0a89e4ea8f2b879dd3816bdcf9dc6c7cb56256747102bed9d270e23530b49f23\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"eb26b6a3c3ea543c7f716a8a0daac4c8fdce335945eff23c0ee42cb3a0cedb49\"" Dec 12 17:25:56.783368 containerd[1562]: time="2025-12-12T17:25:56.783291333Z" level=info msg="StartContainer for \"eb26b6a3c3ea543c7f716a8a0daac4c8fdce335945eff23c0ee42cb3a0cedb49\"" Dec 12 17:25:56.784464 containerd[1562]: time="2025-12-12T17:25:56.784418297Z" level=info msg="connecting to shim eb26b6a3c3ea543c7f716a8a0daac4c8fdce335945eff23c0ee42cb3a0cedb49" address="unix:///run/containerd/s/26cb6ef1cec77fd5d8571867ad212ac16219331571f135ec2574cf455b244f95" protocol=ttrpc version=3 Dec 12 17:25:56.809214 systemd[1]: Started cri-containerd-eb26b6a3c3ea543c7f716a8a0daac4c8fdce335945eff23c0ee42cb3a0cedb49.scope - libcontainer container eb26b6a3c3ea543c7f716a8a0daac4c8fdce335945eff23c0ee42cb3a0cedb49. 
Dec 12 17:25:56.848161 containerd[1562]: time="2025-12-12T17:25:56.848041561Z" level=info msg="StartContainer for \"eb26b6a3c3ea543c7f716a8a0daac4c8fdce335945eff23c0ee42cb3a0cedb49\" returns successfully" Dec 12 17:25:57.195848 kubelet[2788]: I1212 17:25:57.195507 2788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-fjnhb" podStartSLOduration=4.195487188 podStartE2EDuration="4.195487188s" podCreationTimestamp="2025-12-12 17:25:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 17:25:55.20338016 +0000 UTC m=+6.235473965" watchObservedRunningTime="2025-12-12 17:25:57.195487188 +0000 UTC m=+8.227581033" Dec 12 17:25:57.197949 kubelet[2788]: I1212 17:25:57.196611 2788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7dcd859c48-s92g9" podStartSLOduration=1.095550827 podStartE2EDuration="3.196589992s" podCreationTimestamp="2025-12-12 17:25:54 +0000 UTC" firstStartedPulling="2025-12-12 17:25:54.648330187 +0000 UTC m=+5.680423992" lastFinishedPulling="2025-12-12 17:25:56.749369352 +0000 UTC m=+7.781463157" observedRunningTime="2025-12-12 17:25:57.196388711 +0000 UTC m=+8.228482516" watchObservedRunningTime="2025-12-12 17:25:57.196589992 +0000 UTC m=+8.228683797" Dec 12 17:26:03.208406 sudo[1855]: pam_unix(sudo:session): session closed for user root Dec 12 17:26:03.368100 sshd[1854]: Connection closed by 139.178.89.65 port 48274 Dec 12 17:26:03.368754 sshd-session[1851]: pam_unix(sshd:session): session closed for user core Dec 12 17:26:03.375462 systemd[1]: sshd@6-46.224.132.113:22-139.178.89.65:48274.service: Deactivated successfully. Dec 12 17:26:03.381763 systemd[1]: session-7.scope: Deactivated successfully. Dec 12 17:26:03.381986 systemd[1]: session-7.scope: Consumed 6.758s CPU time, 223.9M memory peak. Dec 12 17:26:03.385064 systemd-logind[1523]: Session 7 logged out. 
Waiting for processes to exit. Dec 12 17:26:03.387385 systemd-logind[1523]: Removed session 7. Dec 12 17:26:14.359257 systemd[1]: Created slice kubepods-besteffort-pod722e7417_170f_4e93_b85a_e847658786d5.slice - libcontainer container kubepods-besteffort-pod722e7417_170f_4e93_b85a_e847658786d5.slice. Dec 12 17:26:14.439319 kubelet[2788]: I1212 17:26:14.439259 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/722e7417-170f-4e93-b85a-e847658786d5-typha-certs\") pod \"calico-typha-785cf97c86-fmvzt\" (UID: \"722e7417-170f-4e93-b85a-e847658786d5\") " pod="calico-system/calico-typha-785cf97c86-fmvzt" Dec 12 17:26:14.440182 kubelet[2788]: I1212 17:26:14.439930 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/722e7417-170f-4e93-b85a-e847658786d5-tigera-ca-bundle\") pod \"calico-typha-785cf97c86-fmvzt\" (UID: \"722e7417-170f-4e93-b85a-e847658786d5\") " pod="calico-system/calico-typha-785cf97c86-fmvzt" Dec 12 17:26:14.440182 kubelet[2788]: I1212 17:26:14.440003 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tntjd\" (UniqueName: \"kubernetes.io/projected/722e7417-170f-4e93-b85a-e847658786d5-kube-api-access-tntjd\") pod \"calico-typha-785cf97c86-fmvzt\" (UID: \"722e7417-170f-4e93-b85a-e847658786d5\") " pod="calico-system/calico-typha-785cf97c86-fmvzt" Dec 12 17:26:14.585410 systemd[1]: Created slice kubepods-besteffort-podf2550c2f_eb3e_4e98_85b4_76a4e9e8a17a.slice - libcontainer container kubepods-besteffort-podf2550c2f_eb3e_4e98_85b4_76a4e9e8a17a.slice. 
Dec 12 17:26:14.641728 kubelet[2788]: I1212 17:26:14.641379 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2550c2f-eb3e-4e98-85b4-76a4e9e8a17a-tigera-ca-bundle\") pod \"calico-node-ds6dz\" (UID: \"f2550c2f-eb3e-4e98-85b4-76a4e9e8a17a\") " pod="calico-system/calico-node-ds6dz" Dec 12 17:26:14.641728 kubelet[2788]: I1212 17:26:14.641521 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/f2550c2f-eb3e-4e98-85b4-76a4e9e8a17a-xtables-lock\") pod \"calico-node-ds6dz\" (UID: \"f2550c2f-eb3e-4e98-85b4-76a4e9e8a17a\") " pod="calico-system/calico-node-ds6dz" Dec 12 17:26:14.641728 kubelet[2788]: I1212 17:26:14.641575 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/f2550c2f-eb3e-4e98-85b4-76a4e9e8a17a-policysync\") pod \"calico-node-ds6dz\" (UID: \"f2550c2f-eb3e-4e98-85b4-76a4e9e8a17a\") " pod="calico-system/calico-node-ds6dz" Dec 12 17:26:14.641728 kubelet[2788]: I1212 17:26:14.641612 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/f2550c2f-eb3e-4e98-85b4-76a4e9e8a17a-var-lib-calico\") pod \"calico-node-ds6dz\" (UID: \"f2550c2f-eb3e-4e98-85b4-76a4e9e8a17a\") " pod="calico-system/calico-node-ds6dz" Dec 12 17:26:14.641728 kubelet[2788]: I1212 17:26:14.641646 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/f2550c2f-eb3e-4e98-85b4-76a4e9e8a17a-cni-bin-dir\") pod \"calico-node-ds6dz\" (UID: \"f2550c2f-eb3e-4e98-85b4-76a4e9e8a17a\") " pod="calico-system/calico-node-ds6dz" Dec 12 17:26:14.642018 kubelet[2788]: I1212 17:26:14.641691 2788 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/f2550c2f-eb3e-4e98-85b4-76a4e9e8a17a-cni-net-dir\") pod \"calico-node-ds6dz\" (UID: \"f2550c2f-eb3e-4e98-85b4-76a4e9e8a17a\") " pod="calico-system/calico-node-ds6dz" Dec 12 17:26:14.642018 kubelet[2788]: I1212 17:26:14.641724 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/f2550c2f-eb3e-4e98-85b4-76a4e9e8a17a-node-certs\") pod \"calico-node-ds6dz\" (UID: \"f2550c2f-eb3e-4e98-85b4-76a4e9e8a17a\") " pod="calico-system/calico-node-ds6dz" Dec 12 17:26:14.642018 kubelet[2788]: I1212 17:26:14.641754 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/f2550c2f-eb3e-4e98-85b4-76a4e9e8a17a-cni-log-dir\") pod \"calico-node-ds6dz\" (UID: \"f2550c2f-eb3e-4e98-85b4-76a4e9e8a17a\") " pod="calico-system/calico-node-ds6dz" Dec 12 17:26:14.642018 kubelet[2788]: I1212 17:26:14.641782 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/f2550c2f-eb3e-4e98-85b4-76a4e9e8a17a-flexvol-driver-host\") pod \"calico-node-ds6dz\" (UID: \"f2550c2f-eb3e-4e98-85b4-76a4e9e8a17a\") " pod="calico-system/calico-node-ds6dz" Dec 12 17:26:14.642018 kubelet[2788]: I1212 17:26:14.641816 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/f2550c2f-eb3e-4e98-85b4-76a4e9e8a17a-var-run-calico\") pod \"calico-node-ds6dz\" (UID: \"f2550c2f-eb3e-4e98-85b4-76a4e9e8a17a\") " pod="calico-system/calico-node-ds6dz" Dec 12 17:26:14.642134 kubelet[2788]: I1212 17:26:14.641851 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-sf84x\" (UniqueName: \"kubernetes.io/projected/f2550c2f-eb3e-4e98-85b4-76a4e9e8a17a-kube-api-access-sf84x\") pod \"calico-node-ds6dz\" (UID: \"f2550c2f-eb3e-4e98-85b4-76a4e9e8a17a\") " pod="calico-system/calico-node-ds6dz" Dec 12 17:26:14.642134 kubelet[2788]: I1212 17:26:14.641947 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f2550c2f-eb3e-4e98-85b4-76a4e9e8a17a-lib-modules\") pod \"calico-node-ds6dz\" (UID: \"f2550c2f-eb3e-4e98-85b4-76a4e9e8a17a\") " pod="calico-system/calico-node-ds6dz" Dec 12 17:26:14.665214 containerd[1562]: time="2025-12-12T17:26:14.665000606Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-785cf97c86-fmvzt,Uid:722e7417-170f-4e93-b85a-e847658786d5,Namespace:calico-system,Attempt:0,}" Dec 12 17:26:14.694190 containerd[1562]: time="2025-12-12T17:26:14.694060225Z" level=info msg="connecting to shim 27385e71d2601469ff3946278dc5c05cab1504904a6f0711f65e0fa196d91374" address="unix:///run/containerd/s/81fc94be22c429bad3a704d64bd8c879dea0d32ae975406a6de2cb52176c8353" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:26:14.735142 systemd[1]: Started cri-containerd-27385e71d2601469ff3946278dc5c05cab1504904a6f0711f65e0fa196d91374.scope - libcontainer container 27385e71d2601469ff3946278dc5c05cab1504904a6f0711f65e0fa196d91374. 
Dec 12 17:26:14.748624 kubelet[2788]: E1212 17:26:14.747990 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:14.748624 kubelet[2788]: W1212 17:26:14.748041 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:14.748624 kubelet[2788]: E1212 17:26:14.748069 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:14.749197 kubelet[2788]: E1212 17:26:14.749004 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:14.749197 kubelet[2788]: W1212 17:26:14.749041 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:14.749197 kubelet[2788]: E1212 17:26:14.749058 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:14.750239 kubelet[2788]: E1212 17:26:14.749447 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:14.750239 kubelet[2788]: W1212 17:26:14.749460 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:14.750239 kubelet[2788]: E1212 17:26:14.749474 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:14.754052 kubelet[2788]: E1212 17:26:14.750785 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:14.754052 kubelet[2788]: W1212 17:26:14.750806 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:14.754052 kubelet[2788]: E1212 17:26:14.751487 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:14.754192 kubelet[2788]: E1212 17:26:14.754105 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:14.754501 kubelet[2788]: W1212 17:26:14.754122 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:14.754501 kubelet[2788]: E1212 17:26:14.754253 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:14.756180 kubelet[2788]: E1212 17:26:14.756048 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:14.756180 kubelet[2788]: W1212 17:26:14.756076 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:14.756284 kubelet[2788]: E1212 17:26:14.756190 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:14.758967 kubelet[2788]: E1212 17:26:14.758939 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:14.758967 kubelet[2788]: W1212 17:26:14.758963 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:14.759306 kubelet[2788]: E1212 17:26:14.758981 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:14.760320 kubelet[2788]: E1212 17:26:14.760222 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:14.760544 kubelet[2788]: W1212 17:26:14.760391 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:14.760684 kubelet[2788]: E1212 17:26:14.760608 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:14.762399 kubelet[2788]: E1212 17:26:14.762341 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:14.762399 kubelet[2788]: W1212 17:26:14.762357 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:14.762399 kubelet[2788]: E1212 17:26:14.762371 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:14.763334 kubelet[2788]: E1212 17:26:14.762727 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:14.763334 kubelet[2788]: W1212 17:26:14.762743 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:14.763611 kubelet[2788]: E1212 17:26:14.763483 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:14.763725 kubelet[2788]: E1212 17:26:14.763713 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:14.763795 kubelet[2788]: W1212 17:26:14.763784 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:14.763855 kubelet[2788]: E1212 17:26:14.763840 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:14.764900 kubelet[2788]: E1212 17:26:14.764263 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:14.765006 kubelet[2788]: W1212 17:26:14.764989 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:14.765075 kubelet[2788]: E1212 17:26:14.765064 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:14.765390 kubelet[2788]: E1212 17:26:14.765319 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:14.765390 kubelet[2788]: W1212 17:26:14.765330 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:14.765390 kubelet[2788]: E1212 17:26:14.765341 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:14.767265 kubelet[2788]: E1212 17:26:14.767185 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:14.767458 kubelet[2788]: W1212 17:26:14.767385 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:14.767710 kubelet[2788]: E1212 17:26:14.767670 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:14.769051 kubelet[2788]: E1212 17:26:14.769015 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:14.769051 kubelet[2788]: W1212 17:26:14.769036 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:14.769051 kubelet[2788]: E1212 17:26:14.769050 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:14.781099 kubelet[2788]: E1212 17:26:14.781052 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-qj8f4" podUID="b3554d30-e274-4b18-8389-696d2cc03c37" Dec 12 17:26:14.822171 kubelet[2788]: E1212 17:26:14.822141 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:14.822171 kubelet[2788]: W1212 17:26:14.822163 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:14.822473 kubelet[2788]: E1212 17:26:14.822184 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:14.822473 kubelet[2788]: E1212 17:26:14.822298 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:14.822473 kubelet[2788]: W1212 17:26:14.822304 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:14.822473 kubelet[2788]: E1212 17:26:14.822341 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:14.822473 kubelet[2788]: E1212 17:26:14.822456 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:14.822473 kubelet[2788]: W1212 17:26:14.822464 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:14.822473 kubelet[2788]: E1212 17:26:14.822472 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:14.822624 kubelet[2788]: E1212 17:26:14.822588 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:14.822624 kubelet[2788]: W1212 17:26:14.822595 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:14.822624 kubelet[2788]: E1212 17:26:14.822603 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:14.824007 kubelet[2788]: E1212 17:26:14.822711 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:14.824007 kubelet[2788]: W1212 17:26:14.822726 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:14.824007 kubelet[2788]: E1212 17:26:14.822735 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:14.824007 kubelet[2788]: E1212 17:26:14.822848 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:14.824007 kubelet[2788]: W1212 17:26:14.822855 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:14.824007 kubelet[2788]: E1212 17:26:14.822877 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:14.824007 kubelet[2788]: E1212 17:26:14.822976 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:14.824007 kubelet[2788]: W1212 17:26:14.822982 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:14.824007 kubelet[2788]: E1212 17:26:14.822989 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:14.824007 kubelet[2788]: E1212 17:26:14.823078 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:14.825253 kubelet[2788]: W1212 17:26:14.823084 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:14.825253 kubelet[2788]: E1212 17:26:14.823090 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:14.825253 kubelet[2788]: E1212 17:26:14.823188 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:14.825253 kubelet[2788]: W1212 17:26:14.823195 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:14.825253 kubelet[2788]: E1212 17:26:14.823201 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:14.825253 kubelet[2788]: E1212 17:26:14.823296 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:14.825253 kubelet[2788]: W1212 17:26:14.823302 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:14.825253 kubelet[2788]: E1212 17:26:14.823310 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:14.825253 kubelet[2788]: E1212 17:26:14.823464 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:14.825253 kubelet[2788]: W1212 17:26:14.823473 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:14.825482 kubelet[2788]: E1212 17:26:14.823481 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:14.825482 kubelet[2788]: E1212 17:26:14.823621 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:14.825482 kubelet[2788]: W1212 17:26:14.823628 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:14.825482 kubelet[2788]: E1212 17:26:14.823636 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:14.825482 kubelet[2788]: E1212 17:26:14.823765 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:14.825482 kubelet[2788]: W1212 17:26:14.823772 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:14.825482 kubelet[2788]: E1212 17:26:14.823780 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:14.825482 kubelet[2788]: E1212 17:26:14.823974 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:14.825482 kubelet[2788]: W1212 17:26:14.823984 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:14.825482 kubelet[2788]: E1212 17:26:14.823994 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:14.825685 kubelet[2788]: E1212 17:26:14.824103 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:14.825685 kubelet[2788]: W1212 17:26:14.824111 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:14.825685 kubelet[2788]: E1212 17:26:14.824118 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:14.825685 kubelet[2788]: E1212 17:26:14.824228 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:14.825685 kubelet[2788]: W1212 17:26:14.824242 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:14.825685 kubelet[2788]: E1212 17:26:14.824249 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Dec 12 17:26:14.825685 kubelet[2788]: E1212 17:26:14.824377 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.825685 kubelet[2788]: W1212 17:26:14.824384 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.825685 kubelet[2788]: E1212 17:26:14.824393 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.825685 kubelet[2788]: E1212 17:26:14.824517 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.826680 kubelet[2788]: W1212 17:26:14.824524 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.826680 kubelet[2788]: E1212 17:26:14.824531 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.826680 kubelet[2788]: E1212 17:26:14.824630 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.826680 kubelet[2788]: W1212 17:26:14.824636 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.826680 kubelet[2788]: E1212 17:26:14.824644 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.826680 kubelet[2788]: E1212 17:26:14.824742 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.826680 kubelet[2788]: W1212 17:26:14.824751 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.826680 kubelet[2788]: E1212 17:26:14.824758 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.845203 kubelet[2788]: E1212 17:26:14.845141 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.845203 kubelet[2788]: W1212 17:26:14.845167 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.845203 kubelet[2788]: E1212 17:26:14.845188 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.845655 kubelet[2788]: I1212 17:26:14.845605 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b3554d30-e274-4b18-8389-696d2cc03c37-registration-dir\") pod \"csi-node-driver-qj8f4\" (UID: \"b3554d30-e274-4b18-8389-696d2cc03c37\") " pod="calico-system/csi-node-driver-qj8f4"
Dec 12 17:26:14.846596 kubelet[2788]: E1212 17:26:14.846555 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.846596 kubelet[2788]: W1212 17:26:14.846581 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.846596 kubelet[2788]: E1212 17:26:14.846597 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.847339 kubelet[2788]: I1212 17:26:14.847305 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/b3554d30-e274-4b18-8389-696d2cc03c37-varrun\") pod \"csi-node-driver-qj8f4\" (UID: \"b3554d30-e274-4b18-8389-696d2cc03c37\") " pod="calico-system/csi-node-driver-qj8f4"
Dec 12 17:26:14.847683 kubelet[2788]: E1212 17:26:14.847629 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.847683 kubelet[2788]: W1212 17:26:14.847650 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.848090 kubelet[2788]: E1212 17:26:14.847664 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.848879 kubelet[2788]: E1212 17:26:14.848822 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.848879 kubelet[2788]: W1212 17:26:14.848841 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.848879 kubelet[2788]: E1212 17:26:14.848854 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.849913 kubelet[2788]: E1212 17:26:14.849883 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.849913 kubelet[2788]: W1212 17:26:14.849909 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.850001 kubelet[2788]: E1212 17:26:14.849925 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.850108 kubelet[2788]: I1212 17:26:14.850048 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b3554d30-e274-4b18-8389-696d2cc03c37-kubelet-dir\") pod \"csi-node-driver-qj8f4\" (UID: \"b3554d30-e274-4b18-8389-696d2cc03c37\") " pod="calico-system/csi-node-driver-qj8f4"
Dec 12 17:26:14.850737 kubelet[2788]: E1212 17:26:14.850693 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.850737 kubelet[2788]: W1212 17:26:14.850716 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.850737 kubelet[2788]: E1212 17:26:14.850728 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.851404 kubelet[2788]: E1212 17:26:14.851379 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.851585 kubelet[2788]: W1212 17:26:14.851399 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.851585 kubelet[2788]: E1212 17:26:14.851474 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.852033 kubelet[2788]: E1212 17:26:14.851986 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.852033 kubelet[2788]: W1212 17:26:14.852005 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.852033 kubelet[2788]: E1212 17:26:14.852017 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.852334 kubelet[2788]: I1212 17:26:14.852298 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b3554d30-e274-4b18-8389-696d2cc03c37-socket-dir\") pod \"csi-node-driver-qj8f4\" (UID: \"b3554d30-e274-4b18-8389-696d2cc03c37\") " pod="calico-system/csi-node-driver-qj8f4"
Dec 12 17:26:14.853663 kubelet[2788]: E1212 17:26:14.853580 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.853663 kubelet[2788]: W1212 17:26:14.853604 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.853663 kubelet[2788]: E1212 17:26:14.853617 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.854111 kubelet[2788]: I1212 17:26:14.854077 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lrv7\" (UniqueName: \"kubernetes.io/projected/b3554d30-e274-4b18-8389-696d2cc03c37-kube-api-access-6lrv7\") pod \"csi-node-driver-qj8f4\" (UID: \"b3554d30-e274-4b18-8389-696d2cc03c37\") " pod="calico-system/csi-node-driver-qj8f4"
Dec 12 17:26:14.855035 kubelet[2788]: E1212 17:26:14.855007 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.855035 kubelet[2788]: W1212 17:26:14.855025 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.855035 kubelet[2788]: E1212 17:26:14.855039 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.855880 containerd[1562]: time="2025-12-12T17:26:14.855530030Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-785cf97c86-fmvzt,Uid:722e7417-170f-4e93-b85a-e847658786d5,Namespace:calico-system,Attempt:0,} returns sandbox id \"27385e71d2601469ff3946278dc5c05cab1504904a6f0711f65e0fa196d91374\""
Dec 12 17:26:14.855975 kubelet[2788]: E1212 17:26:14.855926 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.855975 kubelet[2788]: W1212 17:26:14.855938 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.855975 kubelet[2788]: E1212 17:26:14.855949 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.857608 kubelet[2788]: E1212 17:26:14.857541 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.857608 kubelet[2788]: W1212 17:26:14.857559 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.857608 kubelet[2788]: E1212 17:26:14.857571 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.857957 kubelet[2788]: E1212 17:26:14.857761 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.857957 kubelet[2788]: W1212 17:26:14.857776 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.857957 kubelet[2788]: E1212 17:26:14.857785 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.859732 containerd[1562]: time="2025-12-12T17:26:14.859688599Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\""
Dec 12 17:26:14.860126 kubelet[2788]: E1212 17:26:14.860091 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.860126 kubelet[2788]: W1212 17:26:14.860114 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.860126 kubelet[2788]: E1212 17:26:14.860129 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.860868 kubelet[2788]: E1212 17:26:14.860833 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.860868 kubelet[2788]: W1212 17:26:14.860849 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.860868 kubelet[2788]: E1212 17:26:14.860882 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.890028 containerd[1562]: time="2025-12-12T17:26:14.889969098Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-ds6dz,Uid:f2550c2f-eb3e-4e98-85b4-76a4e9e8a17a,Namespace:calico-system,Attempt:0,}"
Dec 12 17:26:14.919493 containerd[1562]: time="2025-12-12T17:26:14.918336848Z" level=info msg="connecting to shim c9998c49c1f5ecc943243ada1847cf2d4f9883591d4475ad04f01ec7af260479" address="unix:///run/containerd/s/a982f26370673afd9bf15d207ae8b71239526c0cbfbd33a0bcb11acf38099a9c" namespace=k8s.io protocol=ttrpc version=3
Dec 12 17:26:14.945200 systemd[1]: Started cri-containerd-c9998c49c1f5ecc943243ada1847cf2d4f9883591d4475ad04f01ec7af260479.scope - libcontainer container c9998c49c1f5ecc943243ada1847cf2d4f9883591d4475ad04f01ec7af260479.
Dec 12 17:26:14.955536 kubelet[2788]: E1212 17:26:14.955499 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.955536 kubelet[2788]: W1212 17:26:14.955523 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.955536 kubelet[2788]: E1212 17:26:14.955544 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.955994 kubelet[2788]: E1212 17:26:14.955970 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.955994 kubelet[2788]: W1212 17:26:14.955989 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.956221 kubelet[2788]: E1212 17:26:14.956003 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.956531 kubelet[2788]: E1212 17:26:14.956508 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.956531 kubelet[2788]: W1212 17:26:14.956529 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.956735 kubelet[2788]: E1212 17:26:14.956544 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.957146 kubelet[2788]: E1212 17:26:14.957101 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.957146 kubelet[2788]: W1212 17:26:14.957135 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.957330 kubelet[2788]: E1212 17:26:14.957166 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.957554 kubelet[2788]: E1212 17:26:14.957532 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.957554 kubelet[2788]: W1212 17:26:14.957550 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.957745 kubelet[2788]: E1212 17:26:14.957572 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.958096 kubelet[2788]: E1212 17:26:14.958074 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.958096 kubelet[2788]: W1212 17:26:14.958093 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.958277 kubelet[2788]: E1212 17:26:14.958107 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.958416 kubelet[2788]: E1212 17:26:14.958356 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.958416 kubelet[2788]: W1212 17:26:14.958366 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.958416 kubelet[2788]: E1212 17:26:14.958376 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.958774 kubelet[2788]: E1212 17:26:14.958642 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.958774 kubelet[2788]: W1212 17:26:14.958652 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.958774 kubelet[2788]: E1212 17:26:14.958662 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.959059 kubelet[2788]: E1212 17:26:14.958956 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.959059 kubelet[2788]: W1212 17:26:14.958966 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.959059 kubelet[2788]: E1212 17:26:14.958977 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.959566 kubelet[2788]: E1212 17:26:14.959511 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.959566 kubelet[2788]: W1212 17:26:14.959547 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.959670 kubelet[2788]: E1212 17:26:14.959561 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.960163 kubelet[2788]: E1212 17:26:14.960126 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.960163 kubelet[2788]: W1212 17:26:14.960144 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.960163 kubelet[2788]: E1212 17:26:14.960156 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.960558 kubelet[2788]: E1212 17:26:14.960535 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.960558 kubelet[2788]: W1212 17:26:14.960553 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.960749 kubelet[2788]: E1212 17:26:14.960566 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.960984 kubelet[2788]: E1212 17:26:14.960962 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.960984 kubelet[2788]: W1212 17:26:14.960978 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.961072 kubelet[2788]: E1212 17:26:14.960992 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.961229 kubelet[2788]: E1212 17:26:14.961205 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.961229 kubelet[2788]: W1212 17:26:14.961219 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.961229 kubelet[2788]: E1212 17:26:14.961230 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.962054 kubelet[2788]: E1212 17:26:14.962030 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.962054 kubelet[2788]: W1212 17:26:14.962048 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.962144 kubelet[2788]: E1212 17:26:14.962060 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.962266 kubelet[2788]: E1212 17:26:14.962231 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.962266 kubelet[2788]: W1212 17:26:14.962246 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.962266 kubelet[2788]: E1212 17:26:14.962255 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.962455 kubelet[2788]: E1212 17:26:14.962406 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.962455 kubelet[2788]: W1212 17:26:14.962414 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.962455 kubelet[2788]: E1212 17:26:14.962435 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.962629 kubelet[2788]: E1212 17:26:14.962602 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.962629 kubelet[2788]: W1212 17:26:14.962610 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.962629 kubelet[2788]: E1212 17:26:14.962617 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.963075 kubelet[2788]: E1212 17:26:14.962855 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.963075 kubelet[2788]: W1212 17:26:14.962876 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.963075 kubelet[2788]: E1212 17:26:14.962889 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.963341 kubelet[2788]: E1212 17:26:14.963310 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.963341 kubelet[2788]: W1212 17:26:14.963337 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.963528 kubelet[2788]: E1212 17:26:14.963349 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.964620 kubelet[2788]: E1212 17:26:14.963640 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.964620 kubelet[2788]: W1212 17:26:14.963651 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.964620 kubelet[2788]: E1212 17:26:14.963661 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.964620 kubelet[2788]: E1212 17:26:14.964178 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.964620 kubelet[2788]: W1212 17:26:14.964189 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.964620 kubelet[2788]: E1212 17:26:14.964216 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.964620 kubelet[2788]: E1212 17:26:14.964508 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.964620 kubelet[2788]: W1212 17:26:14.964518 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.964620 kubelet[2788]: E1212 17:26:14.964554 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.964900 kubelet[2788]: E1212 17:26:14.964765 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.964900 kubelet[2788]: W1212 17:26:14.964774 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.964900 kubelet[2788]: E1212 17:26:14.964797 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.965030 kubelet[2788]: E1212 17:26:14.965010 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.965074 kubelet[2788]: W1212 17:26:14.965025 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.965074 kubelet[2788]: E1212 17:26:14.965042 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.980758 kubelet[2788]: E1212 17:26:14.980724 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:14.980758 kubelet[2788]: W1212 17:26:14.980748 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:14.981026 kubelet[2788]: E1212 17:26:14.980783 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:26:14.993949 containerd[1562]: time="2025-12-12T17:26:14.993837756Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-ds6dz,Uid:f2550c2f-eb3e-4e98-85b4-76a4e9e8a17a,Namespace:calico-system,Attempt:0,} returns sandbox id \"c9998c49c1f5ecc943243ada1847cf2d4f9883591d4475ad04f01ec7af260479\""
Dec 12 17:26:16.104643 kubelet[2788]: E1212 17:26:16.104565 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-qj8f4" podUID="b3554d30-e274-4b18-8389-696d2cc03c37"
Dec 12 17:26:16.329179 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2362441189.mount: Deactivated successfully.
Dec 12 17:26:18.105451 kubelet[2788]: E1212 17:26:18.105002 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-qj8f4" podUID="b3554d30-e274-4b18-8389-696d2cc03c37"
Dec 12 17:26:20.104730 kubelet[2788]: E1212 17:26:20.104327 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-qj8f4" podUID="b3554d30-e274-4b18-8389-696d2cc03c37"
Dec 12 17:26:21.666512 containerd[1562]: time="2025-12-12T17:26:21.665494386Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 17:26:21.666512 containerd[1562]: time="2025-12-12T17:26:21.666460505Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=33090687"
Dec 12 17:26:21.667214 containerd[1562]: time="2025-12-12T17:26:21.667174003Z" level=info msg="ImageCreate event name:\"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 17:26:21.668980 containerd[1562]: time="2025-12-12T17:26:21.668950788Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 17:26:21.669491 containerd[1562]: time="2025-12-12T17:26:21.669458749Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"33090541\" in 6.809727187s"
Dec 12 17:26:21.669541 containerd[1562]: time="2025-12-12T17:26:21.669492792Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\""
Dec 12 17:26:21.670925 containerd[1562]: time="2025-12-12T17:26:21.670900027Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\""
Dec 12 17:26:21.686784 containerd[1562]: time="2025-12-12T17:26:21.686729277Z" level=info msg="CreateContainer within sandbox \"27385e71d2601469ff3946278dc5c05cab1504904a6f0711f65e0fa196d91374\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Dec 12 17:26:21.703456 containerd[1562]: time="2025-12-12T17:26:21.702117772Z" level=info msg="Container 5cdc4f9293678b679fb3b4af6421c0b782f6b8f7e3744f9b760083260d49203f: CDI devices from CRI Config.CDIDevices: []"
Dec 12 17:26:21.705731 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2129770165.mount: Deactivated successfully.
Dec 12 17:26:21.715146 containerd[1562]: time="2025-12-12T17:26:21.715084910Z" level=info msg="CreateContainer within sandbox \"27385e71d2601469ff3946278dc5c05cab1504904a6f0711f65e0fa196d91374\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"5cdc4f9293678b679fb3b4af6421c0b782f6b8f7e3744f9b760083260d49203f\"" Dec 12 17:26:21.715841 containerd[1562]: time="2025-12-12T17:26:21.715815449Z" level=info msg="StartContainer for \"5cdc4f9293678b679fb3b4af6421c0b782f6b8f7e3744f9b760083260d49203f\"" Dec 12 17:26:21.717271 containerd[1562]: time="2025-12-12T17:26:21.717242446Z" level=info msg="connecting to shim 5cdc4f9293678b679fb3b4af6421c0b782f6b8f7e3744f9b760083260d49203f" address="unix:///run/containerd/s/81fc94be22c429bad3a704d64bd8c879dea0d32ae975406a6de2cb52176c8353" protocol=ttrpc version=3 Dec 12 17:26:21.744113 systemd[1]: Started cri-containerd-5cdc4f9293678b679fb3b4af6421c0b782f6b8f7e3744f9b760083260d49203f.scope - libcontainer container 5cdc4f9293678b679fb3b4af6421c0b782f6b8f7e3744f9b760083260d49203f. 
Dec 12 17:26:21.795396 containerd[1562]: time="2025-12-12T17:26:21.795336534Z" level=info msg="StartContainer for \"5cdc4f9293678b679fb3b4af6421c0b782f6b8f7e3744f9b760083260d49203f\" returns successfully" Dec 12 17:26:22.104604 kubelet[2788]: E1212 17:26:22.104548 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-qj8f4" podUID="b3554d30-e274-4b18-8389-696d2cc03c37" Dec 12 17:26:22.272075 kubelet[2788]: E1212 17:26:22.271962 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:22.272075 kubelet[2788]: W1212 17:26:22.272021 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:22.272568 kubelet[2788]: E1212 17:26:22.272053 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:22.273022 kubelet[2788]: E1212 17:26:22.273000 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:22.273269 kubelet[2788]: W1212 17:26:22.273083 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:22.273479 kubelet[2788]: E1212 17:26:22.273327 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:22.274004 kubelet[2788]: E1212 17:26:22.273840 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:22.274004 kubelet[2788]: W1212 17:26:22.273898 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:22.274267 kubelet[2788]: E1212 17:26:22.274189 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:22.274892 kubelet[2788]: E1212 17:26:22.274801 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:22.275077 kubelet[2788]: W1212 17:26:22.274826 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:22.275077 kubelet[2788]: E1212 17:26:22.274959 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:22.275530 kubelet[2788]: E1212 17:26:22.275462 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:22.275530 kubelet[2788]: W1212 17:26:22.275474 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:22.275530 kubelet[2788]: E1212 17:26:22.275486 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:22.276753 kubelet[2788]: E1212 17:26:22.276565 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:22.276753 kubelet[2788]: W1212 17:26:22.276612 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:22.276753 kubelet[2788]: E1212 17:26:22.276627 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:22.278376 kubelet[2788]: E1212 17:26:22.278357 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:22.278618 kubelet[2788]: W1212 17:26:22.278464 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:22.278618 kubelet[2788]: E1212 17:26:22.278484 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:22.279105 kubelet[2788]: E1212 17:26:22.278878 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:22.279105 kubelet[2788]: W1212 17:26:22.278892 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:22.279105 kubelet[2788]: E1212 17:26:22.278903 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:22.279483 kubelet[2788]: E1212 17:26:22.279454 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:22.279807 kubelet[2788]: W1212 17:26:22.279736 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:22.279807 kubelet[2788]: E1212 17:26:22.279758 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:22.280416 kubelet[2788]: E1212 17:26:22.280301 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:22.280416 kubelet[2788]: W1212 17:26:22.280348 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:22.280751 kubelet[2788]: E1212 17:26:22.280682 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:22.281093 kubelet[2788]: E1212 17:26:22.281073 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:22.281295 kubelet[2788]: W1212 17:26:22.281190 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:22.281295 kubelet[2788]: E1212 17:26:22.281208 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:22.281678 kubelet[2788]: E1212 17:26:22.281516 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:22.281678 kubelet[2788]: W1212 17:26:22.281528 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:22.281678 kubelet[2788]: E1212 17:26:22.281544 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:22.282098 kubelet[2788]: E1212 17:26:22.282034 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:22.282271 kubelet[2788]: W1212 17:26:22.282185 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:22.282271 kubelet[2788]: E1212 17:26:22.282202 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:22.282679 kubelet[2788]: E1212 17:26:22.282627 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:22.282679 kubelet[2788]: W1212 17:26:22.282643 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:22.282679 kubelet[2788]: E1212 17:26:22.282654 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:22.283059 kubelet[2788]: E1212 17:26:22.283019 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:22.283059 kubelet[2788]: W1212 17:26:22.283033 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:22.283245 kubelet[2788]: E1212 17:26:22.283165 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:22.310745 kubelet[2788]: E1212 17:26:22.310690 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:22.310745 kubelet[2788]: W1212 17:26:22.310736 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:22.311270 kubelet[2788]: E1212 17:26:22.310771 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:22.311505 kubelet[2788]: E1212 17:26:22.311405 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:22.311505 kubelet[2788]: W1212 17:26:22.311449 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:22.311505 kubelet[2788]: E1212 17:26:22.311471 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:22.312088 kubelet[2788]: E1212 17:26:22.312058 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:22.312179 kubelet[2788]: W1212 17:26:22.312110 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:22.312179 kubelet[2788]: E1212 17:26:22.312137 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:22.312527 kubelet[2788]: E1212 17:26:22.312504 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:22.312527 kubelet[2788]: W1212 17:26:22.312518 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:22.312527 kubelet[2788]: E1212 17:26:22.312531 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:22.313136 kubelet[2788]: E1212 17:26:22.313118 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:22.313136 kubelet[2788]: W1212 17:26:22.313133 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:22.313282 kubelet[2788]: E1212 17:26:22.313146 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:22.313346 kubelet[2788]: E1212 17:26:22.313335 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:22.313346 kubelet[2788]: W1212 17:26:22.313344 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:22.313465 kubelet[2788]: E1212 17:26:22.313354 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:22.313519 kubelet[2788]: E1212 17:26:22.313507 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:22.313519 kubelet[2788]: W1212 17:26:22.313517 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:22.313621 kubelet[2788]: E1212 17:26:22.313529 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:22.313672 kubelet[2788]: E1212 17:26:22.313665 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:22.313727 kubelet[2788]: W1212 17:26:22.313672 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:22.313727 kubelet[2788]: E1212 17:26:22.313681 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:22.313996 kubelet[2788]: E1212 17:26:22.313818 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:22.313996 kubelet[2788]: W1212 17:26:22.313826 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:22.313996 kubelet[2788]: E1212 17:26:22.313834 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:22.314971 kubelet[2788]: E1212 17:26:22.314156 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:22.314971 kubelet[2788]: W1212 17:26:22.314170 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:22.314971 kubelet[2788]: E1212 17:26:22.314183 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:22.314971 kubelet[2788]: E1212 17:26:22.314591 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:22.314971 kubelet[2788]: W1212 17:26:22.314617 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:22.314971 kubelet[2788]: E1212 17:26:22.314643 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:22.315659 kubelet[2788]: E1212 17:26:22.315417 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:22.315659 kubelet[2788]: W1212 17:26:22.315442 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:22.315659 kubelet[2788]: E1212 17:26:22.315465 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:22.316190 kubelet[2788]: E1212 17:26:22.315787 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:22.316190 kubelet[2788]: W1212 17:26:22.315805 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:22.316190 kubelet[2788]: E1212 17:26:22.315827 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:22.316678 kubelet[2788]: E1212 17:26:22.316500 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:22.316678 kubelet[2788]: W1212 17:26:22.316523 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:22.316678 kubelet[2788]: E1212 17:26:22.316544 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:22.316995 kubelet[2788]: E1212 17:26:22.316975 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:22.317092 kubelet[2788]: W1212 17:26:22.317077 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:22.317158 kubelet[2788]: E1212 17:26:22.317145 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:22.317698 kubelet[2788]: E1212 17:26:22.317600 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:22.317698 kubelet[2788]: W1212 17:26:22.317618 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:22.317698 kubelet[2788]: E1212 17:26:22.317635 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:22.318403 kubelet[2788]: E1212 17:26:22.318356 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:22.318403 kubelet[2788]: W1212 17:26:22.318387 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:22.318403 kubelet[2788]: E1212 17:26:22.318405 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:22.322182 kubelet[2788]: E1212 17:26:22.322133 2788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:22.322403 kubelet[2788]: W1212 17:26:22.322362 2788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:22.322520 kubelet[2788]: E1212 17:26:22.322499 2788 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:23.092924 containerd[1562]: time="2025-12-12T17:26:23.092830918Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:23.094446 containerd[1562]: time="2025-12-12T17:26:23.094384918Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=4266741" Dec 12 17:26:23.095109 containerd[1562]: time="2025-12-12T17:26:23.095051970Z" level=info msg="ImageCreate event name:\"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:23.098357 containerd[1562]: time="2025-12-12T17:26:23.098289580Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:23.098856 containerd[1562]: time="2025-12-12T17:26:23.098669810Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5636392\" in 1.427627691s" Dec 12 17:26:23.098856 containerd[1562]: time="2025-12-12T17:26:23.098701052Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\"" Dec 12 17:26:23.108015 containerd[1562]: time="2025-12-12T17:26:23.107911205Z" level=info msg="CreateContainer within sandbox \"c9998c49c1f5ecc943243ada1847cf2d4f9883591d4475ad04f01ec7af260479\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Dec 12 17:26:23.117096 containerd[1562]: time="2025-12-12T17:26:23.117026830Z" level=info msg="Container bf28c57dfed21d9a0e46eff849e62891809bf84339bde30a91cc588a3dcaebe2: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:26:23.130582 containerd[1562]: time="2025-12-12T17:26:23.130501193Z" level=info msg="CreateContainer within sandbox \"c9998c49c1f5ecc943243ada1847cf2d4f9883591d4475ad04f01ec7af260479\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"bf28c57dfed21d9a0e46eff849e62891809bf84339bde30a91cc588a3dcaebe2\"" Dec 12 17:26:23.131211 containerd[1562]: time="2025-12-12T17:26:23.131180565Z" level=info msg="StartContainer for \"bf28c57dfed21d9a0e46eff849e62891809bf84339bde30a91cc588a3dcaebe2\"" Dec 12 17:26:23.135120 containerd[1562]: time="2025-12-12T17:26:23.135070986Z" level=info msg="connecting to shim bf28c57dfed21d9a0e46eff849e62891809bf84339bde30a91cc588a3dcaebe2" address="unix:///run/containerd/s/a982f26370673afd9bf15d207ae8b71239526c0cbfbd33a0bcb11acf38099a9c" protocol=ttrpc version=3 Dec 12 17:26:23.162130 systemd[1]: Started cri-containerd-bf28c57dfed21d9a0e46eff849e62891809bf84339bde30a91cc588a3dcaebe2.scope - libcontainer container 
bf28c57dfed21d9a0e46eff849e62891809bf84339bde30a91cc588a3dcaebe2. Dec 12 17:26:23.243156 containerd[1562]: time="2025-12-12T17:26:23.243058303Z" level=info msg="StartContainer for \"bf28c57dfed21d9a0e46eff849e62891809bf84339bde30a91cc588a3dcaebe2\" returns successfully" Dec 12 17:26:23.259660 systemd[1]: cri-containerd-bf28c57dfed21d9a0e46eff849e62891809bf84339bde30a91cc588a3dcaebe2.scope: Deactivated successfully. Dec 12 17:26:23.268156 containerd[1562]: time="2025-12-12T17:26:23.268096480Z" level=info msg="received container exit event container_id:\"bf28c57dfed21d9a0e46eff849e62891809bf84339bde30a91cc588a3dcaebe2\" id:\"bf28c57dfed21d9a0e46eff849e62891809bf84339bde30a91cc588a3dcaebe2\" pid:3479 exited_at:{seconds:1765560383 nanos:267628724}" Dec 12 17:26:23.269121 kubelet[2788]: I1212 17:26:23.269045 2788 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 12 17:26:23.296491 kubelet[2788]: I1212 17:26:23.296056 2788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-785cf97c86-fmvzt" podStartSLOduration=2.484101061 podStartE2EDuration="9.296037442s" podCreationTimestamp="2025-12-12 17:26:14 +0000 UTC" firstStartedPulling="2025-12-12 17:26:14.858845836 +0000 UTC m=+25.890939641" lastFinishedPulling="2025-12-12 17:26:21.670782217 +0000 UTC m=+32.702876022" observedRunningTime="2025-12-12 17:26:22.280307329 +0000 UTC m=+33.312401134" watchObservedRunningTime="2025-12-12 17:26:23.296037442 +0000 UTC m=+34.328131247" Dec 12 17:26:23.304105 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-bf28c57dfed21d9a0e46eff849e62891809bf84339bde30a91cc588a3dcaebe2-rootfs.mount: Deactivated successfully. 
Dec 12 17:26:24.104133 kubelet[2788]: E1212 17:26:24.104052 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-qj8f4" podUID="b3554d30-e274-4b18-8389-696d2cc03c37" Dec 12 17:26:24.279397 containerd[1562]: time="2025-12-12T17:26:24.279344459Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Dec 12 17:26:26.103720 kubelet[2788]: E1212 17:26:26.103628 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-qj8f4" podUID="b3554d30-e274-4b18-8389-696d2cc03c37" Dec 12 17:26:26.898840 containerd[1562]: time="2025-12-12T17:26:26.898071709Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:26.899935 containerd[1562]: time="2025-12-12T17:26:26.899901840Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=65925816" Dec 12 17:26:26.901233 containerd[1562]: time="2025-12-12T17:26:26.901204893Z" level=info msg="ImageCreate event name:\"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:26.905039 containerd[1562]: time="2025-12-12T17:26:26.904980524Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:26.905966 containerd[1562]: time="2025-12-12T17:26:26.905525443Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" 
with image id \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"67295507\" in 2.626140741s" Dec 12 17:26:26.905966 containerd[1562]: time="2025-12-12T17:26:26.905570166Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\"" Dec 12 17:26:26.917882 containerd[1562]: time="2025-12-12T17:26:26.917469458Z" level=info msg="CreateContainer within sandbox \"c9998c49c1f5ecc943243ada1847cf2d4f9883591d4475ad04f01ec7af260479\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Dec 12 17:26:26.928646 containerd[1562]: time="2025-12-12T17:26:26.927093347Z" level=info msg="Container fcac7aff7272c322be6a1520dcaf5d7d5bf2ae14315bafaf30e9cac209238f42: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:26:26.954997 containerd[1562]: time="2025-12-12T17:26:26.954923100Z" level=info msg="CreateContainer within sandbox \"c9998c49c1f5ecc943243ada1847cf2d4f9883591d4475ad04f01ec7af260479\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"fcac7aff7272c322be6a1520dcaf5d7d5bf2ae14315bafaf30e9cac209238f42\"" Dec 12 17:26:26.956336 containerd[1562]: time="2025-12-12T17:26:26.956276037Z" level=info msg="StartContainer for \"fcac7aff7272c322be6a1520dcaf5d7d5bf2ae14315bafaf30e9cac209238f42\"" Dec 12 17:26:26.958639 containerd[1562]: time="2025-12-12T17:26:26.958585682Z" level=info msg="connecting to shim fcac7aff7272c322be6a1520dcaf5d7d5bf2ae14315bafaf30e9cac209238f42" address="unix:///run/containerd/s/a982f26370673afd9bf15d207ae8b71239526c0cbfbd33a0bcb11acf38099a9c" protocol=ttrpc version=3 Dec 12 17:26:26.990195 systemd[1]: Started cri-containerd-fcac7aff7272c322be6a1520dcaf5d7d5bf2ae14315bafaf30e9cac209238f42.scope - libcontainer container 
fcac7aff7272c322be6a1520dcaf5d7d5bf2ae14315bafaf30e9cac209238f42. Dec 12 17:26:27.083255 containerd[1562]: time="2025-12-12T17:26:27.083166778Z" level=info msg="StartContainer for \"fcac7aff7272c322be6a1520dcaf5d7d5bf2ae14315bafaf30e9cac209238f42\" returns successfully" Dec 12 17:26:27.598396 containerd[1562]: time="2025-12-12T17:26:27.598343622Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Dec 12 17:26:27.602901 systemd[1]: cri-containerd-fcac7aff7272c322be6a1520dcaf5d7d5bf2ae14315bafaf30e9cac209238f42.scope: Deactivated successfully. Dec 12 17:26:27.603366 systemd[1]: cri-containerd-fcac7aff7272c322be6a1520dcaf5d7d5bf2ae14315bafaf30e9cac209238f42.scope: Consumed 521ms CPU time, 185.5M memory peak, 165.9M written to disk. Dec 12 17:26:27.609316 containerd[1562]: time="2025-12-12T17:26:27.609262144Z" level=info msg="received container exit event container_id:\"fcac7aff7272c322be6a1520dcaf5d7d5bf2ae14315bafaf30e9cac209238f42\" id:\"fcac7aff7272c322be6a1520dcaf5d7d5bf2ae14315bafaf30e9cac209238f42\" pid:3541 exited_at:{seconds:1765560387 nanos:608968564}" Dec 12 17:26:27.636974 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-fcac7aff7272c322be6a1520dcaf5d7d5bf2ae14315bafaf30e9cac209238f42-rootfs.mount: Deactivated successfully. Dec 12 17:26:27.671990 kubelet[2788]: I1212 17:26:27.671947 2788 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Dec 12 17:26:27.738838 systemd[1]: Created slice kubepods-burstable-pod6ebd6282_432a_4c60_b9bf_2fc460bd9666.slice - libcontainer container kubepods-burstable-pod6ebd6282_432a_4c60_b9bf_2fc460bd9666.slice. 
Dec 12 17:26:27.748561 kubelet[2788]: I1212 17:26:27.748460 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvvcx\" (UniqueName: \"kubernetes.io/projected/6ebd6282-432a-4c60-b9bf-2fc460bd9666-kube-api-access-pvvcx\") pod \"coredns-674b8bbfcf-qhtq2\" (UID: \"6ebd6282-432a-4c60-b9bf-2fc460bd9666\") " pod="kube-system/coredns-674b8bbfcf-qhtq2" Dec 12 17:26:27.748561 kubelet[2788]: I1212 17:26:27.748504 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a29c7f30-65b8-4e18-856c-c9d5d13aec46-config-volume\") pod \"coredns-674b8bbfcf-vfbql\" (UID: \"a29c7f30-65b8-4e18-856c-c9d5d13aec46\") " pod="kube-system/coredns-674b8bbfcf-vfbql" Dec 12 17:26:27.748561 kubelet[2788]: I1212 17:26:27.748527 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfmw6\" (UniqueName: \"kubernetes.io/projected/a29c7f30-65b8-4e18-856c-c9d5d13aec46-kube-api-access-jfmw6\") pod \"coredns-674b8bbfcf-vfbql\" (UID: \"a29c7f30-65b8-4e18-856c-c9d5d13aec46\") " pod="kube-system/coredns-674b8bbfcf-vfbql" Dec 12 17:26:27.748561 kubelet[2788]: I1212 17:26:27.748548 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwlm8\" (UniqueName: \"kubernetes.io/projected/c7baeefe-1ae7-4ac7-a668-035dfb7baaef-kube-api-access-xwlm8\") pod \"calico-kube-controllers-8f9f7d5ff-pj77s\" (UID: \"c7baeefe-1ae7-4ac7-a668-035dfb7baaef\") " pod="calico-system/calico-kube-controllers-8f9f7d5ff-pj77s" Dec 12 17:26:27.748561 kubelet[2788]: I1212 17:26:27.748563 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ebd6282-432a-4c60-b9bf-2fc460bd9666-config-volume\") pod \"coredns-674b8bbfcf-qhtq2\" (UID: 
\"6ebd6282-432a-4c60-b9bf-2fc460bd9666\") " pod="kube-system/coredns-674b8bbfcf-qhtq2" Dec 12 17:26:27.748781 kubelet[2788]: I1212 17:26:27.748580 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7baeefe-1ae7-4ac7-a668-035dfb7baaef-tigera-ca-bundle\") pod \"calico-kube-controllers-8f9f7d5ff-pj77s\" (UID: \"c7baeefe-1ae7-4ac7-a668-035dfb7baaef\") " pod="calico-system/calico-kube-controllers-8f9f7d5ff-pj77s" Dec 12 17:26:27.753938 systemd[1]: Created slice kubepods-burstable-poda29c7f30_65b8_4e18_856c_c9d5d13aec46.slice - libcontainer container kubepods-burstable-poda29c7f30_65b8_4e18_856c_c9d5d13aec46.slice. Dec 12 17:26:27.766158 systemd[1]: Created slice kubepods-besteffort-podc7baeefe_1ae7_4ac7_a668_035dfb7baaef.slice - libcontainer container kubepods-besteffort-podc7baeefe_1ae7_4ac7_a668_035dfb7baaef.slice. Dec 12 17:26:27.776783 systemd[1]: Created slice kubepods-besteffort-pod2598be35_bdd7_4b8f_994d_8273d0db5ae9.slice - libcontainer container kubepods-besteffort-pod2598be35_bdd7_4b8f_994d_8273d0db5ae9.slice. Dec 12 17:26:27.792021 systemd[1]: Created slice kubepods-besteffort-podddeb7989_7e65_4a92_bc15_0517e755a359.slice - libcontainer container kubepods-besteffort-podddeb7989_7e65_4a92_bc15_0517e755a359.slice. Dec 12 17:26:27.802116 systemd[1]: Created slice kubepods-besteffort-pod96b8454b_22d4_4695_8613_4bfabbdf8fc4.slice - libcontainer container kubepods-besteffort-pod96b8454b_22d4_4695_8613_4bfabbdf8fc4.slice. Dec 12 17:26:27.813026 systemd[1]: Created slice kubepods-besteffort-podb2a6e3f8_60c3_41cd_b9ef_8e00c63a997f.slice - libcontainer container kubepods-besteffort-podb2a6e3f8_60c3_41cd_b9ef_8e00c63a997f.slice. 
Dec 12 17:26:27.849965 kubelet[2788]: I1212 17:26:27.849442 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2a6e3f8-60c3-41cd-b9ef-8e00c63a997f-config\") pod \"goldmane-666569f655-mf55c\" (UID: \"b2a6e3f8-60c3-41cd-b9ef-8e00c63a997f\") " pod="calico-system/goldmane-666569f655-mf55c" Dec 12 17:26:27.849965 kubelet[2788]: I1212 17:26:27.849491 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n2ms\" (UniqueName: \"kubernetes.io/projected/b2a6e3f8-60c3-41cd-b9ef-8e00c63a997f-kube-api-access-6n2ms\") pod \"goldmane-666569f655-mf55c\" (UID: \"b2a6e3f8-60c3-41cd-b9ef-8e00c63a997f\") " pod="calico-system/goldmane-666569f655-mf55c" Dec 12 17:26:27.849965 kubelet[2788]: I1212 17:26:27.849521 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/2598be35-bdd7-4b8f-994d-8273d0db5ae9-calico-apiserver-certs\") pod \"calico-apiserver-74c4865b69-4lwrd\" (UID: \"2598be35-bdd7-4b8f-994d-8273d0db5ae9\") " pod="calico-apiserver/calico-apiserver-74c4865b69-4lwrd" Dec 12 17:26:27.849965 kubelet[2788]: I1212 17:26:27.849550 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdt9m\" (UniqueName: \"kubernetes.io/projected/96b8454b-22d4-4695-8613-4bfabbdf8fc4-kube-api-access-gdt9m\") pod \"calico-apiserver-74c4865b69-rq46q\" (UID: \"96b8454b-22d4-4695-8613-4bfabbdf8fc4\") " pod="calico-apiserver/calico-apiserver-74c4865b69-rq46q" Dec 12 17:26:27.849965 kubelet[2788]: I1212 17:26:27.849566 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ddeb7989-7e65-4a92-bc15-0517e755a359-whisker-backend-key-pair\") pod \"whisker-df57b5dbf-7jtrg\" 
(UID: \"ddeb7989-7e65-4a92-bc15-0517e755a359\") " pod="calico-system/whisker-df57b5dbf-7jtrg" Dec 12 17:26:27.850174 kubelet[2788]: I1212 17:26:27.849582 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/96b8454b-22d4-4695-8613-4bfabbdf8fc4-calico-apiserver-certs\") pod \"calico-apiserver-74c4865b69-rq46q\" (UID: \"96b8454b-22d4-4695-8613-4bfabbdf8fc4\") " pod="calico-apiserver/calico-apiserver-74c4865b69-rq46q" Dec 12 17:26:27.850174 kubelet[2788]: I1212 17:26:27.849597 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2a6e3f8-60c3-41cd-b9ef-8e00c63a997f-goldmane-ca-bundle\") pod \"goldmane-666569f655-mf55c\" (UID: \"b2a6e3f8-60c3-41cd-b9ef-8e00c63a997f\") " pod="calico-system/goldmane-666569f655-mf55c" Dec 12 17:26:27.850174 kubelet[2788]: I1212 17:26:27.849637 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddeb7989-7e65-4a92-bc15-0517e755a359-whisker-ca-bundle\") pod \"whisker-df57b5dbf-7jtrg\" (UID: \"ddeb7989-7e65-4a92-bc15-0517e755a359\") " pod="calico-system/whisker-df57b5dbf-7jtrg" Dec 12 17:26:27.850174 kubelet[2788]: I1212 17:26:27.849653 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/b2a6e3f8-60c3-41cd-b9ef-8e00c63a997f-goldmane-key-pair\") pod \"goldmane-666569f655-mf55c\" (UID: \"b2a6e3f8-60c3-41cd-b9ef-8e00c63a997f\") " pod="calico-system/goldmane-666569f655-mf55c" Dec 12 17:26:27.850174 kubelet[2788]: I1212 17:26:27.849668 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l79zc\" (UniqueName: 
\"kubernetes.io/projected/2598be35-bdd7-4b8f-994d-8273d0db5ae9-kube-api-access-l79zc\") pod \"calico-apiserver-74c4865b69-4lwrd\" (UID: \"2598be35-bdd7-4b8f-994d-8273d0db5ae9\") " pod="calico-apiserver/calico-apiserver-74c4865b69-4lwrd" Dec 12 17:26:27.850333 kubelet[2788]: I1212 17:26:27.849695 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twgkt\" (UniqueName: \"kubernetes.io/projected/ddeb7989-7e65-4a92-bc15-0517e755a359-kube-api-access-twgkt\") pod \"whisker-df57b5dbf-7jtrg\" (UID: \"ddeb7989-7e65-4a92-bc15-0517e755a359\") " pod="calico-system/whisker-df57b5dbf-7jtrg" Dec 12 17:26:28.050602 containerd[1562]: time="2025-12-12T17:26:28.050448977Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-qhtq2,Uid:6ebd6282-432a-4c60-b9bf-2fc460bd9666,Namespace:kube-system,Attempt:0,}" Dec 12 17:26:28.063985 containerd[1562]: time="2025-12-12T17:26:28.063926934Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-vfbql,Uid:a29c7f30-65b8-4e18-856c-c9d5d13aec46,Namespace:kube-system,Attempt:0,}" Dec 12 17:26:28.075892 containerd[1562]: time="2025-12-12T17:26:28.075503482Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8f9f7d5ff-pj77s,Uid:c7baeefe-1ae7-4ac7-a668-035dfb7baaef,Namespace:calico-system,Attempt:0,}" Dec 12 17:26:28.089674 containerd[1562]: time="2025-12-12T17:26:28.089633564Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-74c4865b69-4lwrd,Uid:2598be35-bdd7-4b8f-994d-8273d0db5ae9,Namespace:calico-apiserver,Attempt:0,}" Dec 12 17:26:28.099496 containerd[1562]: time="2025-12-12T17:26:28.099380267Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-df57b5dbf-7jtrg,Uid:ddeb7989-7e65-4a92-bc15-0517e755a359,Namespace:calico-system,Attempt:0,}" Dec 12 17:26:28.112153 containerd[1562]: time="2025-12-12T17:26:28.111800873Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-74c4865b69-rq46q,Uid:96b8454b-22d4-4695-8613-4bfabbdf8fc4,Namespace:calico-apiserver,Attempt:0,}" Dec 12 17:26:28.114182 systemd[1]: Created slice kubepods-besteffort-podb3554d30_e274_4b18_8389_696d2cc03c37.slice - libcontainer container kubepods-besteffort-podb3554d30_e274_4b18_8389_696d2cc03c37.slice. Dec 12 17:26:28.117902 containerd[1562]: time="2025-12-12T17:26:28.117847644Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-qj8f4,Uid:b3554d30-e274-4b18-8389-696d2cc03c37,Namespace:calico-system,Attempt:0,}" Dec 12 17:26:28.120295 containerd[1562]: time="2025-12-12T17:26:28.120005791Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-mf55c,Uid:b2a6e3f8-60c3-41cd-b9ef-8e00c63a997f,Namespace:calico-system,Attempt:0,}" Dec 12 17:26:28.290427 containerd[1562]: time="2025-12-12T17:26:28.290285501Z" level=error msg="Failed to destroy network for sandbox \"0ec45200417ebcd4d064f75633926d0be5bd0687a7f55f04d6e1a955f4f5bd1b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:28.293206 containerd[1562]: time="2025-12-12T17:26:28.293124694Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-vfbql,Uid:a29c7f30-65b8-4e18-856c-c9d5d13aec46,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0ec45200417ebcd4d064f75633926d0be5bd0687a7f55f04d6e1a955f4f5bd1b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:28.295893 kubelet[2788]: E1212 17:26:28.295717 2788 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"0ec45200417ebcd4d064f75633926d0be5bd0687a7f55f04d6e1a955f4f5bd1b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:28.295893 kubelet[2788]: E1212 17:26:28.295806 2788 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0ec45200417ebcd4d064f75633926d0be5bd0687a7f55f04d6e1a955f4f5bd1b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-vfbql" Dec 12 17:26:28.295893 kubelet[2788]: E1212 17:26:28.295829 2788 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0ec45200417ebcd4d064f75633926d0be5bd0687a7f55f04d6e1a955f4f5bd1b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-vfbql" Dec 12 17:26:28.296091 kubelet[2788]: E1212 17:26:28.295992 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-vfbql_kube-system(a29c7f30-65b8-4e18-856c-c9d5d13aec46)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-vfbql_kube-system(a29c7f30-65b8-4e18-856c-c9d5d13aec46)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0ec45200417ebcd4d064f75633926d0be5bd0687a7f55f04d6e1a955f4f5bd1b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-vfbql" 
podUID="a29c7f30-65b8-4e18-856c-c9d5d13aec46" Dec 12 17:26:28.313278 containerd[1562]: time="2025-12-12T17:26:28.313132936Z" level=error msg="Failed to destroy network for sandbox \"c9bba40e23b8ffc1fd66197fd1818b9f643b41389c4dbc071945caf85c37c26a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:28.319065 containerd[1562]: time="2025-12-12T17:26:28.318754318Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-qhtq2,Uid:6ebd6282-432a-4c60-b9bf-2fc460bd9666,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c9bba40e23b8ffc1fd66197fd1818b9f643b41389c4dbc071945caf85c37c26a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:28.321671 containerd[1562]: time="2025-12-12T17:26:28.320846501Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Dec 12 17:26:28.322308 kubelet[2788]: E1212 17:26:28.321891 2788 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c9bba40e23b8ffc1fd66197fd1818b9f643b41389c4dbc071945caf85c37c26a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:28.322308 kubelet[2788]: E1212 17:26:28.321950 2788 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c9bba40e23b8ffc1fd66197fd1818b9f643b41389c4dbc071945caf85c37c26a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-qhtq2" Dec 12 17:26:28.322308 kubelet[2788]: E1212 17:26:28.321970 2788 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c9bba40e23b8ffc1fd66197fd1818b9f643b41389c4dbc071945caf85c37c26a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-qhtq2" Dec 12 17:26:28.322461 kubelet[2788]: E1212 17:26:28.322027 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-qhtq2_kube-system(6ebd6282-432a-4c60-b9bf-2fc460bd9666)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-qhtq2_kube-system(6ebd6282-432a-4c60-b9bf-2fc460bd9666)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c9bba40e23b8ffc1fd66197fd1818b9f643b41389c4dbc071945caf85c37c26a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-qhtq2" podUID="6ebd6282-432a-4c60-b9bf-2fc460bd9666" Dec 12 17:26:28.337569 containerd[1562]: time="2025-12-12T17:26:28.337355224Z" level=error msg="Failed to destroy network for sandbox \"8758789e4d723c1366ec2ee7f46b770b2a981f7968f5e0aa807410d0e13cfb49\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:28.343470 containerd[1562]: time="2025-12-12T17:26:28.343410076Z" level=error msg="Failed to destroy network for sandbox \"9181206a56f721b4224c496ba74dbc523370eca3060d14689f8272e052006ad0\"" error="plugin 
type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:28.344184 containerd[1562]: time="2025-12-12T17:26:28.344145727Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8f9f7d5ff-pj77s,Uid:c7baeefe-1ae7-4ac7-a668-035dfb7baaef,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8758789e4d723c1366ec2ee7f46b770b2a981f7968f5e0aa807410d0e13cfb49\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:28.344656 kubelet[2788]: E1212 17:26:28.344610 2788 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8758789e4d723c1366ec2ee7f46b770b2a981f7968f5e0aa807410d0e13cfb49\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:28.344747 kubelet[2788]: E1212 17:26:28.344669 2788 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8758789e4d723c1366ec2ee7f46b770b2a981f7968f5e0aa807410d0e13cfb49\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-8f9f7d5ff-pj77s" Dec 12 17:26:28.344747 kubelet[2788]: E1212 17:26:28.344688 2788 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8758789e4d723c1366ec2ee7f46b770b2a981f7968f5e0aa807410d0e13cfb49\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-8f9f7d5ff-pj77s" Dec 12 17:26:28.344810 kubelet[2788]: E1212 17:26:28.344758 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-8f9f7d5ff-pj77s_calico-system(c7baeefe-1ae7-4ac7-a668-035dfb7baaef)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-8f9f7d5ff-pj77s_calico-system(c7baeefe-1ae7-4ac7-a668-035dfb7baaef)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8758789e4d723c1366ec2ee7f46b770b2a981f7968f5e0aa807410d0e13cfb49\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-8f9f7d5ff-pj77s" podUID="c7baeefe-1ae7-4ac7-a668-035dfb7baaef" Dec 12 17:26:28.347103 containerd[1562]: time="2025-12-12T17:26:28.346554050Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-74c4865b69-4lwrd,Uid:2598be35-bdd7-4b8f-994d-8273d0db5ae9,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9181206a56f721b4224c496ba74dbc523370eca3060d14689f8272e052006ad0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:28.347754 kubelet[2788]: E1212 17:26:28.347425 2788 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9181206a56f721b4224c496ba74dbc523370eca3060d14689f8272e052006ad0\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:28.347754 kubelet[2788]: E1212 17:26:28.347495 2788 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9181206a56f721b4224c496ba74dbc523370eca3060d14689f8272e052006ad0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-74c4865b69-4lwrd" Dec 12 17:26:28.347754 kubelet[2788]: E1212 17:26:28.347516 2788 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9181206a56f721b4224c496ba74dbc523370eca3060d14689f8272e052006ad0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-74c4865b69-4lwrd" Dec 12 17:26:28.347985 kubelet[2788]: E1212 17:26:28.347558 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-74c4865b69-4lwrd_calico-apiserver(2598be35-bdd7-4b8f-994d-8273d0db5ae9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-74c4865b69-4lwrd_calico-apiserver(2598be35-bdd7-4b8f-994d-8273d0db5ae9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9181206a56f721b4224c496ba74dbc523370eca3060d14689f8272e052006ad0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-74c4865b69-4lwrd" podUID="2598be35-bdd7-4b8f-994d-8273d0db5ae9" Dec 12 
17:26:28.381903 containerd[1562]: time="2025-12-12T17:26:28.381050478Z" level=error msg="Failed to destroy network for sandbox \"b1cefee69ecbb9eec0739c38bfa0f3224d5de76f0761b44fe8206b77837b3f45\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:28.385796 containerd[1562]: time="2025-12-12T17:26:28.385656552Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-qj8f4,Uid:b3554d30-e274-4b18-8389-696d2cc03c37,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b1cefee69ecbb9eec0739c38bfa0f3224d5de76f0761b44fe8206b77837b3f45\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:28.386147 kubelet[2788]: E1212 17:26:28.385972 2788 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b1cefee69ecbb9eec0739c38bfa0f3224d5de76f0761b44fe8206b77837b3f45\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:28.386147 kubelet[2788]: E1212 17:26:28.386028 2788 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b1cefee69ecbb9eec0739c38bfa0f3224d5de76f0761b44fe8206b77837b3f45\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-qj8f4" Dec 12 17:26:28.386147 kubelet[2788]: E1212 17:26:28.386053 2788 kuberuntime_manager.go:1252] 
"CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b1cefee69ecbb9eec0739c38bfa0f3224d5de76f0761b44fe8206b77837b3f45\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-qj8f4" Dec 12 17:26:28.387133 kubelet[2788]: E1212 17:26:28.386102 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-qj8f4_calico-system(b3554d30-e274-4b18-8389-696d2cc03c37)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-qj8f4_calico-system(b3554d30-e274-4b18-8389-696d2cc03c37)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b1cefee69ecbb9eec0739c38bfa0f3224d5de76f0761b44fe8206b77837b3f45\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-qj8f4" podUID="b3554d30-e274-4b18-8389-696d2cc03c37" Dec 12 17:26:28.387588 containerd[1562]: time="2025-12-12T17:26:28.387533480Z" level=error msg="Failed to destroy network for sandbox \"71e9c04dfd8b217dc57e9b21347742172b4995e10e628652d3c2f1d1da7c727e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:28.390387 containerd[1562]: time="2025-12-12T17:26:28.390163859Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-74c4865b69-rq46q,Uid:96b8454b-22d4-4695-8613-4bfabbdf8fc4,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"71e9c04dfd8b217dc57e9b21347742172b4995e10e628652d3c2f1d1da7c727e\": 
plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:28.391260 kubelet[2788]: E1212 17:26:28.391199 2788 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"71e9c04dfd8b217dc57e9b21347742172b4995e10e628652d3c2f1d1da7c727e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:28.391404 kubelet[2788]: E1212 17:26:28.391322 2788 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"71e9c04dfd8b217dc57e9b21347742172b4995e10e628652d3c2f1d1da7c727e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-74c4865b69-rq46q" Dec 12 17:26:28.391404 kubelet[2788]: E1212 17:26:28.391345 2788 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"71e9c04dfd8b217dc57e9b21347742172b4995e10e628652d3c2f1d1da7c727e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-74c4865b69-rq46q" Dec 12 17:26:28.392005 kubelet[2788]: E1212 17:26:28.391948 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-74c4865b69-rq46q_calico-apiserver(96b8454b-22d4-4695-8613-4bfabbdf8fc4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-apiserver-74c4865b69-rq46q_calico-apiserver(96b8454b-22d4-4695-8613-4bfabbdf8fc4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"71e9c04dfd8b217dc57e9b21347742172b4995e10e628652d3c2f1d1da7c727e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-74c4865b69-rq46q" podUID="96b8454b-22d4-4695-8613-4bfabbdf8fc4" Dec 12 17:26:28.395095 containerd[1562]: time="2025-12-12T17:26:28.395025870Z" level=error msg="Failed to destroy network for sandbox \"5e48a82070a9ccfd307367d21ca3621529e5900abfa3635bb47364dc8a38eac5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:28.398201 containerd[1562]: time="2025-12-12T17:26:28.398115000Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-mf55c,Uid:b2a6e3f8-60c3-41cd-b9ef-8e00c63a997f,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5e48a82070a9ccfd307367d21ca3621529e5900abfa3635bb47364dc8a38eac5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:28.398834 kubelet[2788]: E1212 17:26:28.398358 2788 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5e48a82070a9ccfd307367d21ca3621529e5900abfa3635bb47364dc8a38eac5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:28.398834 kubelet[2788]: E1212 17:26:28.398420 
2788 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5e48a82070a9ccfd307367d21ca3621529e5900abfa3635bb47364dc8a38eac5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-mf55c" Dec 12 17:26:28.398834 kubelet[2788]: E1212 17:26:28.398440 2788 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5e48a82070a9ccfd307367d21ca3621529e5900abfa3635bb47364dc8a38eac5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-mf55c" Dec 12 17:26:28.399300 kubelet[2788]: E1212 17:26:28.398588 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-666569f655-mf55c_calico-system(b2a6e3f8-60c3-41cd-b9ef-8e00c63a997f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-666569f655-mf55c_calico-system(b2a6e3f8-60c3-41cd-b9ef-8e00c63a997f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5e48a82070a9ccfd307367d21ca3621529e5900abfa3635bb47364dc8a38eac5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-666569f655-mf55c" podUID="b2a6e3f8-60c3-41cd-b9ef-8e00c63a997f" Dec 12 17:26:28.407180 containerd[1562]: time="2025-12-12T17:26:28.407118573Z" level=error msg="Failed to destroy network for sandbox \"93bacee9ffc3cd1fa2600f30133c5dda87c49d6c460a3bd5a56ed94bae122681\"" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:28.408535 containerd[1562]: time="2025-12-12T17:26:28.408461024Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-df57b5dbf-7jtrg,Uid:ddeb7989-7e65-4a92-bc15-0517e755a359,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"93bacee9ffc3cd1fa2600f30133c5dda87c49d6c460a3bd5a56ed94bae122681\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:28.409132 kubelet[2788]: E1212 17:26:28.408781 2788 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"93bacee9ffc3cd1fa2600f30133c5dda87c49d6c460a3bd5a56ed94bae122681\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:28.409602 kubelet[2788]: E1212 17:26:28.409475 2788 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"93bacee9ffc3cd1fa2600f30133c5dda87c49d6c460a3bd5a56ed94bae122681\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-df57b5dbf-7jtrg" Dec 12 17:26:28.409882 kubelet[2788]: E1212 17:26:28.409569 2788 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"93bacee9ffc3cd1fa2600f30133c5dda87c49d6c460a3bd5a56ed94bae122681\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file 
or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-df57b5dbf-7jtrg" Dec 12 17:26:28.410508 kubelet[2788]: E1212 17:26:28.410324 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-df57b5dbf-7jtrg_calico-system(ddeb7989-7e65-4a92-bc15-0517e755a359)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-df57b5dbf-7jtrg_calico-system(ddeb7989-7e65-4a92-bc15-0517e755a359)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"93bacee9ffc3cd1fa2600f30133c5dda87c49d6c460a3bd5a56ed94bae122681\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-df57b5dbf-7jtrg" podUID="ddeb7989-7e65-4a92-bc15-0517e755a359" Dec 12 17:26:32.654554 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1896646263.mount: Deactivated successfully. 
Dec 12 17:26:32.682748 containerd[1562]: time="2025-12-12T17:26:32.681598122Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:32.682748 containerd[1562]: time="2025-12-12T17:26:32.682692909Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=150934562" Dec 12 17:26:32.683470 containerd[1562]: time="2025-12-12T17:26:32.683405393Z" level=info msg="ImageCreate event name:\"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:32.690470 containerd[1562]: time="2025-12-12T17:26:32.690392263Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:32.691101 containerd[1562]: time="2025-12-12T17:26:32.691042943Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"150934424\" in 4.369396749s" Dec 12 17:26:32.691101 containerd[1562]: time="2025-12-12T17:26:32.691091386Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\"" Dec 12 17:26:32.713082 containerd[1562]: time="2025-12-12T17:26:32.713035298Z" level=info msg="CreateContainer within sandbox \"c9998c49c1f5ecc943243ada1847cf2d4f9883591d4475ad04f01ec7af260479\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Dec 12 17:26:32.728136 containerd[1562]: time="2025-12-12T17:26:32.728069944Z" level=info msg="Container 
531ae61246f121d2f472eba492fecf1b6cd2539d0afcb48c939453c04843df0a: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:26:32.732133 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1046923208.mount: Deactivated successfully. Dec 12 17:26:32.743464 containerd[1562]: time="2025-12-12T17:26:32.743264560Z" level=info msg="CreateContainer within sandbox \"c9998c49c1f5ecc943243ada1847cf2d4f9883591d4475ad04f01ec7af260479\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"531ae61246f121d2f472eba492fecf1b6cd2539d0afcb48c939453c04843df0a\"" Dec 12 17:26:32.745092 containerd[1562]: time="2025-12-12T17:26:32.745040510Z" level=info msg="StartContainer for \"531ae61246f121d2f472eba492fecf1b6cd2539d0afcb48c939453c04843df0a\"" Dec 12 17:26:32.748145 containerd[1562]: time="2025-12-12T17:26:32.748076097Z" level=info msg="connecting to shim 531ae61246f121d2f472eba492fecf1b6cd2539d0afcb48c939453c04843df0a" address="unix:///run/containerd/s/a982f26370673afd9bf15d207ae8b71239526c0cbfbd33a0bcb11acf38099a9c" protocol=ttrpc version=3 Dec 12 17:26:32.770070 systemd[1]: Started cri-containerd-531ae61246f121d2f472eba492fecf1b6cd2539d0afcb48c939453c04843df0a.scope - libcontainer container 531ae61246f121d2f472eba492fecf1b6cd2539d0afcb48c939453c04843df0a. Dec 12 17:26:32.856495 containerd[1562]: time="2025-12-12T17:26:32.856420371Z" level=info msg="StartContainer for \"531ae61246f121d2f472eba492fecf1b6cd2539d0afcb48c939453c04843df0a\" returns successfully" Dec 12 17:26:33.029240 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Dec 12 17:26:33.029368 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Dec 12 17:26:33.292620 kubelet[2788]: I1212 17:26:33.292502 2788 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twgkt\" (UniqueName: \"kubernetes.io/projected/ddeb7989-7e65-4a92-bc15-0517e755a359-kube-api-access-twgkt\") pod \"ddeb7989-7e65-4a92-bc15-0517e755a359\" (UID: \"ddeb7989-7e65-4a92-bc15-0517e755a359\") " Dec 12 17:26:33.293853 kubelet[2788]: I1212 17:26:33.293192 2788 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ddeb7989-7e65-4a92-bc15-0517e755a359-whisker-backend-key-pair\") pod \"ddeb7989-7e65-4a92-bc15-0517e755a359\" (UID: \"ddeb7989-7e65-4a92-bc15-0517e755a359\") " Dec 12 17:26:33.293853 kubelet[2788]: I1212 17:26:33.293235 2788 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddeb7989-7e65-4a92-bc15-0517e755a359-whisker-ca-bundle\") pod \"ddeb7989-7e65-4a92-bc15-0517e755a359\" (UID: \"ddeb7989-7e65-4a92-bc15-0517e755a359\") " Dec 12 17:26:33.293853 kubelet[2788]: I1212 17:26:33.293624 2788 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddeb7989-7e65-4a92-bc15-0517e755a359-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "ddeb7989-7e65-4a92-bc15-0517e755a359" (UID: "ddeb7989-7e65-4a92-bc15-0517e755a359"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Dec 12 17:26:33.298282 kubelet[2788]: I1212 17:26:33.298234 2788 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddeb7989-7e65-4a92-bc15-0517e755a359-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "ddeb7989-7e65-4a92-bc15-0517e755a359" (UID: "ddeb7989-7e65-4a92-bc15-0517e755a359"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Dec 12 17:26:33.299043 kubelet[2788]: I1212 17:26:33.299014 2788 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddeb7989-7e65-4a92-bc15-0517e755a359-kube-api-access-twgkt" (OuterVolumeSpecName: "kube-api-access-twgkt") pod "ddeb7989-7e65-4a92-bc15-0517e755a359" (UID: "ddeb7989-7e65-4a92-bc15-0517e755a359"). InnerVolumeSpecName "kube-api-access-twgkt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Dec 12 17:26:33.337824 systemd[1]: Removed slice kubepods-besteffort-podddeb7989_7e65_4a92_bc15_0517e755a359.slice - libcontainer container kubepods-besteffort-podddeb7989_7e65_4a92_bc15_0517e755a359.slice. Dec 12 17:26:33.365250 kubelet[2788]: I1212 17:26:33.365161 2788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-ds6dz" podStartSLOduration=1.6676175880000002 podStartE2EDuration="19.3650282s" podCreationTimestamp="2025-12-12 17:26:14 +0000 UTC" firstStartedPulling="2025-12-12 17:26:14.995690818 +0000 UTC m=+26.027784663" lastFinishedPulling="2025-12-12 17:26:32.69310147 +0000 UTC m=+43.725195275" observedRunningTime="2025-12-12 17:26:33.363669318 +0000 UTC m=+44.395763163" watchObservedRunningTime="2025-12-12 17:26:33.3650282 +0000 UTC m=+44.397122045" Dec 12 17:26:33.394337 kubelet[2788]: I1212 17:26:33.394203 2788 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-twgkt\" (UniqueName: \"kubernetes.io/projected/ddeb7989-7e65-4a92-bc15-0517e755a359-kube-api-access-twgkt\") on node \"ci-4459-2-2-0-24adfa6772\" DevicePath \"\"" Dec 12 17:26:33.394654 kubelet[2788]: I1212 17:26:33.394636 2788 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ddeb7989-7e65-4a92-bc15-0517e755a359-whisker-backend-key-pair\") on node \"ci-4459-2-2-0-24adfa6772\" DevicePath \"\"" Dec 12 17:26:33.394654 kubelet[2788]: I1212 17:26:33.394659 2788 
reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddeb7989-7e65-4a92-bc15-0517e755a359-whisker-ca-bundle\") on node \"ci-4459-2-2-0-24adfa6772\" DevicePath \"\"" Dec 12 17:26:33.453386 systemd[1]: Created slice kubepods-besteffort-pod43e43bdb_6d56_4012_94dd_d2daf57ba1c9.slice - libcontainer container kubepods-besteffort-pod43e43bdb_6d56_4012_94dd_d2daf57ba1c9.slice. Dec 12 17:26:33.495669 kubelet[2788]: I1212 17:26:33.495623 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43e43bdb-6d56-4012-94dd-d2daf57ba1c9-whisker-ca-bundle\") pod \"whisker-5cf75855f7-k7v95\" (UID: \"43e43bdb-6d56-4012-94dd-d2daf57ba1c9\") " pod="calico-system/whisker-5cf75855f7-k7v95" Dec 12 17:26:33.495669 kubelet[2788]: I1212 17:26:33.495666 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rbr4\" (UniqueName: \"kubernetes.io/projected/43e43bdb-6d56-4012-94dd-d2daf57ba1c9-kube-api-access-7rbr4\") pod \"whisker-5cf75855f7-k7v95\" (UID: \"43e43bdb-6d56-4012-94dd-d2daf57ba1c9\") " pod="calico-system/whisker-5cf75855f7-k7v95" Dec 12 17:26:33.495920 kubelet[2788]: I1212 17:26:33.495694 2788 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/43e43bdb-6d56-4012-94dd-d2daf57ba1c9-whisker-backend-key-pair\") pod \"whisker-5cf75855f7-k7v95\" (UID: \"43e43bdb-6d56-4012-94dd-d2daf57ba1c9\") " pod="calico-system/whisker-5cf75855f7-k7v95" Dec 12 17:26:33.661150 systemd[1]: var-lib-kubelet-pods-ddeb7989\x2d7e65\x2d4a92\x2dbc15\x2d0517e755a359-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dtwgkt.mount: Deactivated successfully. 
Dec 12 17:26:33.661939 systemd[1]: var-lib-kubelet-pods-ddeb7989\x2d7e65\x2d4a92\x2dbc15\x2d0517e755a359-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Dec 12 17:26:33.758725 containerd[1562]: time="2025-12-12T17:26:33.758669262Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5cf75855f7-k7v95,Uid:43e43bdb-6d56-4012-94dd-d2daf57ba1c9,Namespace:calico-system,Attempt:0,}" Dec 12 17:26:33.952385 systemd-networkd[1434]: cali340f1ff3288: Link UP Dec 12 17:26:33.953578 systemd-networkd[1434]: cali340f1ff3288: Gained carrier Dec 12 17:26:33.977057 containerd[1562]: 2025-12-12 17:26:33.788 [INFO][3867] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 12 17:26:33.977057 containerd[1562]: 2025-12-12 17:26:33.834 [INFO][3867] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--0--24adfa6772-k8s-whisker--5cf75855f7--k7v95-eth0 whisker-5cf75855f7- calico-system 43e43bdb-6d56-4012-94dd-d2daf57ba1c9 928 0 2025-12-12 17:26:33 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:5cf75855f7 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459-2-2-0-24adfa6772 whisker-5cf75855f7-k7v95 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali340f1ff3288 [] [] }} ContainerID="8a98644be3be9bda4fa5d69301c7aaecc39264d4f3edf37786d51da86e078afa" Namespace="calico-system" Pod="whisker-5cf75855f7-k7v95" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-whisker--5cf75855f7--k7v95-" Dec 12 17:26:33.977057 containerd[1562]: 2025-12-12 17:26:33.834 [INFO][3867] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8a98644be3be9bda4fa5d69301c7aaecc39264d4f3edf37786d51da86e078afa" Namespace="calico-system" Pod="whisker-5cf75855f7-k7v95" 
WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-whisker--5cf75855f7--k7v95-eth0" Dec 12 17:26:33.977057 containerd[1562]: 2025-12-12 17:26:33.886 [INFO][3880] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8a98644be3be9bda4fa5d69301c7aaecc39264d4f3edf37786d51da86e078afa" HandleID="k8s-pod-network.8a98644be3be9bda4fa5d69301c7aaecc39264d4f3edf37786d51da86e078afa" Workload="ci--4459--2--2--0--24adfa6772-k8s-whisker--5cf75855f7--k7v95-eth0" Dec 12 17:26:33.977330 containerd[1562]: 2025-12-12 17:26:33.886 [INFO][3880] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="8a98644be3be9bda4fa5d69301c7aaecc39264d4f3edf37786d51da86e078afa" HandleID="k8s-pod-network.8a98644be3be9bda4fa5d69301c7aaecc39264d4f3edf37786d51da86e078afa" Workload="ci--4459--2--2--0--24adfa6772-k8s-whisker--5cf75855f7--k7v95-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b590), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-0-24adfa6772", "pod":"whisker-5cf75855f7-k7v95", "timestamp":"2025-12-12 17:26:33.886411661 +0000 UTC"}, Hostname:"ci-4459-2-2-0-24adfa6772", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:26:33.977330 containerd[1562]: 2025-12-12 17:26:33.886 [INFO][3880] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:26:33.977330 containerd[1562]: 2025-12-12 17:26:33.886 [INFO][3880] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 17:26:33.977330 containerd[1562]: 2025-12-12 17:26:33.886 [INFO][3880] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-0-24adfa6772' Dec 12 17:26:33.977330 containerd[1562]: 2025-12-12 17:26:33.899 [INFO][3880] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8a98644be3be9bda4fa5d69301c7aaecc39264d4f3edf37786d51da86e078afa" host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:33.977330 containerd[1562]: 2025-12-12 17:26:33.906 [INFO][3880] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:33.977330 containerd[1562]: 2025-12-12 17:26:33.912 [INFO][3880] ipam/ipam.go 511: Trying affinity for 192.168.114.192/26 host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:33.977330 containerd[1562]: 2025-12-12 17:26:33.916 [INFO][3880] ipam/ipam.go 158: Attempting to load block cidr=192.168.114.192/26 host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:33.977330 containerd[1562]: 2025-12-12 17:26:33.921 [INFO][3880] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.114.192/26 host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:33.977608 containerd[1562]: 2025-12-12 17:26:33.921 [INFO][3880] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.114.192/26 handle="k8s-pod-network.8a98644be3be9bda4fa5d69301c7aaecc39264d4f3edf37786d51da86e078afa" host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:33.977608 containerd[1562]: 2025-12-12 17:26:33.925 [INFO][3880] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.8a98644be3be9bda4fa5d69301c7aaecc39264d4f3edf37786d51da86e078afa Dec 12 17:26:33.977608 containerd[1562]: 2025-12-12 17:26:33.930 [INFO][3880] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.114.192/26 handle="k8s-pod-network.8a98644be3be9bda4fa5d69301c7aaecc39264d4f3edf37786d51da86e078afa" host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:33.977608 containerd[1562]: 2025-12-12 17:26:33.939 [INFO][3880] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.114.193/26] block=192.168.114.192/26 handle="k8s-pod-network.8a98644be3be9bda4fa5d69301c7aaecc39264d4f3edf37786d51da86e078afa" host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:33.977608 containerd[1562]: 2025-12-12 17:26:33.939 [INFO][3880] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.114.193/26] handle="k8s-pod-network.8a98644be3be9bda4fa5d69301c7aaecc39264d4f3edf37786d51da86e078afa" host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:33.977608 containerd[1562]: 2025-12-12 17:26:33.940 [INFO][3880] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 17:26:33.977608 containerd[1562]: 2025-12-12 17:26:33.940 [INFO][3880] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.114.193/26] IPv6=[] ContainerID="8a98644be3be9bda4fa5d69301c7aaecc39264d4f3edf37786d51da86e078afa" HandleID="k8s-pod-network.8a98644be3be9bda4fa5d69301c7aaecc39264d4f3edf37786d51da86e078afa" Workload="ci--4459--2--2--0--24adfa6772-k8s-whisker--5cf75855f7--k7v95-eth0" Dec 12 17:26:33.977805 containerd[1562]: 2025-12-12 17:26:33.943 [INFO][3867] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8a98644be3be9bda4fa5d69301c7aaecc39264d4f3edf37786d51da86e078afa" Namespace="calico-system" Pod="whisker-5cf75855f7-k7v95" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-whisker--5cf75855f7--k7v95-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--0--24adfa6772-k8s-whisker--5cf75855f7--k7v95-eth0", GenerateName:"whisker-5cf75855f7-", Namespace:"calico-system", SelfLink:"", UID:"43e43bdb-6d56-4012-94dd-d2daf57ba1c9", ResourceVersion:"928", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 26, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5cf75855f7", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-0-24adfa6772", ContainerID:"", Pod:"whisker-5cf75855f7-k7v95", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.114.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali340f1ff3288", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:26:33.977805 containerd[1562]: 2025-12-12 17:26:33.944 [INFO][3867] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.114.193/32] ContainerID="8a98644be3be9bda4fa5d69301c7aaecc39264d4f3edf37786d51da86e078afa" Namespace="calico-system" Pod="whisker-5cf75855f7-k7v95" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-whisker--5cf75855f7--k7v95-eth0" Dec 12 17:26:33.977984 containerd[1562]: 2025-12-12 17:26:33.944 [INFO][3867] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali340f1ff3288 ContainerID="8a98644be3be9bda4fa5d69301c7aaecc39264d4f3edf37786d51da86e078afa" Namespace="calico-system" Pod="whisker-5cf75855f7-k7v95" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-whisker--5cf75855f7--k7v95-eth0" Dec 12 17:26:33.977984 containerd[1562]: 2025-12-12 17:26:33.954 [INFO][3867] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8a98644be3be9bda4fa5d69301c7aaecc39264d4f3edf37786d51da86e078afa" Namespace="calico-system" Pod="whisker-5cf75855f7-k7v95" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-whisker--5cf75855f7--k7v95-eth0" Dec 12 17:26:33.978063 containerd[1562]: 2025-12-12 17:26:33.956 [INFO][3867] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8a98644be3be9bda4fa5d69301c7aaecc39264d4f3edf37786d51da86e078afa" Namespace="calico-system" Pod="whisker-5cf75855f7-k7v95" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-whisker--5cf75855f7--k7v95-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--0--24adfa6772-k8s-whisker--5cf75855f7--k7v95-eth0", GenerateName:"whisker-5cf75855f7-", Namespace:"calico-system", SelfLink:"", UID:"43e43bdb-6d56-4012-94dd-d2daf57ba1c9", ResourceVersion:"928", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 26, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5cf75855f7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-0-24adfa6772", ContainerID:"8a98644be3be9bda4fa5d69301c7aaecc39264d4f3edf37786d51da86e078afa", Pod:"whisker-5cf75855f7-k7v95", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.114.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali340f1ff3288", MAC:"12:cb:9b:a3:15:84", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:26:33.978132 containerd[1562]: 2025-12-12 17:26:33.971 [INFO][3867] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="8a98644be3be9bda4fa5d69301c7aaecc39264d4f3edf37786d51da86e078afa" Namespace="calico-system" Pod="whisker-5cf75855f7-k7v95" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-whisker--5cf75855f7--k7v95-eth0" Dec 12 17:26:34.020963 containerd[1562]: time="2025-12-12T17:26:34.020858674Z" level=info msg="connecting to shim 8a98644be3be9bda4fa5d69301c7aaecc39264d4f3edf37786d51da86e078afa" address="unix:///run/containerd/s/82198a9a0eff9e5ee755a7c861bee4f921eca3d635f3143a0742298f444d847d" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:26:34.051161 systemd[1]: Started cri-containerd-8a98644be3be9bda4fa5d69301c7aaecc39264d4f3edf37786d51da86e078afa.scope - libcontainer container 8a98644be3be9bda4fa5d69301c7aaecc39264d4f3edf37786d51da86e078afa. Dec 12 17:26:34.102402 containerd[1562]: time="2025-12-12T17:26:34.102357976Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5cf75855f7-k7v95,Uid:43e43bdb-6d56-4012-94dd-d2daf57ba1c9,Namespace:calico-system,Attempt:0,} returns sandbox id \"8a98644be3be9bda4fa5d69301c7aaecc39264d4f3edf37786d51da86e078afa\"" Dec 12 17:26:34.104387 containerd[1562]: time="2025-12-12T17:26:34.104314371Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 12 17:26:34.441307 containerd[1562]: time="2025-12-12T17:26:34.441246378Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:26:34.443926 containerd[1562]: time="2025-12-12T17:26:34.443793767Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 12 17:26:34.444053 containerd[1562]: time="2025-12-12T17:26:34.443844170Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 12 17:26:34.444299 
kubelet[2788]: E1212 17:26:34.444237 2788 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:26:34.445229 kubelet[2788]: E1212 17:26:34.444323 2788 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:26:34.454985 kubelet[2788]: E1212 17:26:34.454911 2788 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:c902c75b8da444adacbeeefd6f418d52,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7rbr4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:
*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5cf75855f7-k7v95_calico-system(43e43bdb-6d56-4012-94dd-d2daf57ba1c9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 12 17:26:34.457269 containerd[1562]: time="2025-12-12T17:26:34.457216715Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 12 17:26:34.807437 containerd[1562]: time="2025-12-12T17:26:34.806902590Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:26:34.809997 containerd[1562]: time="2025-12-12T17:26:34.809855923Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 12 17:26:34.809997 containerd[1562]: time="2025-12-12T17:26:34.809960529Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 12 17:26:34.810204 kubelet[2788]: E1212 17:26:34.810159 2788 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": 
ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:26:34.810804 kubelet[2788]: E1212 17:26:34.810255 2788 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:26:34.810940 kubelet[2788]: E1212 17:26:34.810684 2788 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7rbr4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privile
ged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5cf75855f7-k7v95_calico-system(43e43bdb-6d56-4012-94dd-d2daf57ba1c9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 12 17:26:34.812497 kubelet[2788]: E1212 17:26:34.812357 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cf75855f7-k7v95" podUID="43e43bdb-6d56-4012-94dd-d2daf57ba1c9" Dec 12 17:26:35.052101 systemd-networkd[1434]: cali340f1ff3288: Gained IPv6LL Dec 12 17:26:35.109260 kubelet[2788]: I1212 17:26:35.108746 2788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddeb7989-7e65-4a92-bc15-0517e755a359" 
path="/var/lib/kubelet/pods/ddeb7989-7e65-4a92-bc15-0517e755a359/volumes" Dec 12 17:26:35.166629 systemd-networkd[1434]: vxlan.calico: Link UP Dec 12 17:26:35.166641 systemd-networkd[1434]: vxlan.calico: Gained carrier Dec 12 17:26:35.346588 kubelet[2788]: E1212 17:26:35.346483 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cf75855f7-k7v95" podUID="43e43bdb-6d56-4012-94dd-d2daf57ba1c9" Dec 12 17:26:37.165057 systemd-networkd[1434]: vxlan.calico: Gained IPv6LL Dec 12 17:26:39.110274 containerd[1562]: time="2025-12-12T17:26:39.110223617Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-74c4865b69-rq46q,Uid:96b8454b-22d4-4695-8613-4bfabbdf8fc4,Namespace:calico-apiserver,Attempt:0,}" Dec 12 17:26:39.285121 systemd-networkd[1434]: cali04675ea39bf: Link UP Dec 12 17:26:39.289364 systemd-networkd[1434]: cali04675ea39bf: Gained carrier Dec 12 17:26:39.324565 containerd[1562]: 2025-12-12 17:26:39.166 [INFO][4235] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{ci--4459--2--2--0--24adfa6772-k8s-calico--apiserver--74c4865b69--rq46q-eth0 calico-apiserver-74c4865b69- calico-apiserver 96b8454b-22d4-4695-8613-4bfabbdf8fc4 867 0 2025-12-12 17:26:05 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:74c4865b69 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-2-0-24adfa6772 calico-apiserver-74c4865b69-rq46q eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali04675ea39bf [] [] }} ContainerID="c986d65a4edd909d74085dca64e35b5220a4abb6e25ac7ab5c2c87b0d76f73eb" Namespace="calico-apiserver" Pod="calico-apiserver-74c4865b69-rq46q" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-calico--apiserver--74c4865b69--rq46q-" Dec 12 17:26:39.324565 containerd[1562]: 2025-12-12 17:26:39.166 [INFO][4235] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c986d65a4edd909d74085dca64e35b5220a4abb6e25ac7ab5c2c87b0d76f73eb" Namespace="calico-apiserver" Pod="calico-apiserver-74c4865b69-rq46q" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-calico--apiserver--74c4865b69--rq46q-eth0" Dec 12 17:26:39.324565 containerd[1562]: 2025-12-12 17:26:39.213 [INFO][4246] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c986d65a4edd909d74085dca64e35b5220a4abb6e25ac7ab5c2c87b0d76f73eb" HandleID="k8s-pod-network.c986d65a4edd909d74085dca64e35b5220a4abb6e25ac7ab5c2c87b0d76f73eb" Workload="ci--4459--2--2--0--24adfa6772-k8s-calico--apiserver--74c4865b69--rq46q-eth0" Dec 12 17:26:39.325334 containerd[1562]: 2025-12-12 17:26:39.215 [INFO][4246] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="c986d65a4edd909d74085dca64e35b5220a4abb6e25ac7ab5c2c87b0d76f73eb" HandleID="k8s-pod-network.c986d65a4edd909d74085dca64e35b5220a4abb6e25ac7ab5c2c87b0d76f73eb" 
Workload="ci--4459--2--2--0--24adfa6772-k8s-calico--apiserver--74c4865b69--rq46q-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b050), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459-2-2-0-24adfa6772", "pod":"calico-apiserver-74c4865b69-rq46q", "timestamp":"2025-12-12 17:26:39.21373549 +0000 UTC"}, Hostname:"ci-4459-2-2-0-24adfa6772", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:26:39.325334 containerd[1562]: 2025-12-12 17:26:39.215 [INFO][4246] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:26:39.325334 containerd[1562]: 2025-12-12 17:26:39.216 [INFO][4246] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 12 17:26:39.325334 containerd[1562]: 2025-12-12 17:26:39.216 [INFO][4246] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-0-24adfa6772' Dec 12 17:26:39.325334 containerd[1562]: 2025-12-12 17:26:39.228 [INFO][4246] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c986d65a4edd909d74085dca64e35b5220a4abb6e25ac7ab5c2c87b0d76f73eb" host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:39.325334 containerd[1562]: 2025-12-12 17:26:39.235 [INFO][4246] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:39.325334 containerd[1562]: 2025-12-12 17:26:39.241 [INFO][4246] ipam/ipam.go 511: Trying affinity for 192.168.114.192/26 host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:39.325334 containerd[1562]: 2025-12-12 17:26:39.244 [INFO][4246] ipam/ipam.go 158: Attempting to load block cidr=192.168.114.192/26 host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:39.325334 containerd[1562]: 2025-12-12 17:26:39.247 [INFO][4246] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.114.192/26 
host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:39.325593 containerd[1562]: 2025-12-12 17:26:39.247 [INFO][4246] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.114.192/26 handle="k8s-pod-network.c986d65a4edd909d74085dca64e35b5220a4abb6e25ac7ab5c2c87b0d76f73eb" host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:39.325593 containerd[1562]: 2025-12-12 17:26:39.249 [INFO][4246] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.c986d65a4edd909d74085dca64e35b5220a4abb6e25ac7ab5c2c87b0d76f73eb Dec 12 17:26:39.325593 containerd[1562]: 2025-12-12 17:26:39.256 [INFO][4246] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.114.192/26 handle="k8s-pod-network.c986d65a4edd909d74085dca64e35b5220a4abb6e25ac7ab5c2c87b0d76f73eb" host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:39.325593 containerd[1562]: 2025-12-12 17:26:39.267 [INFO][4246] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.114.194/26] block=192.168.114.192/26 handle="k8s-pod-network.c986d65a4edd909d74085dca64e35b5220a4abb6e25ac7ab5c2c87b0d76f73eb" host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:39.325593 containerd[1562]: 2025-12-12 17:26:39.268 [INFO][4246] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.114.194/26] handle="k8s-pod-network.c986d65a4edd909d74085dca64e35b5220a4abb6e25ac7ab5c2c87b0d76f73eb" host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:39.325593 containerd[1562]: 2025-12-12 17:26:39.268 [INFO][4246] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 12 17:26:39.325593 containerd[1562]: 2025-12-12 17:26:39.268 [INFO][4246] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.114.194/26] IPv6=[] ContainerID="c986d65a4edd909d74085dca64e35b5220a4abb6e25ac7ab5c2c87b0d76f73eb" HandleID="k8s-pod-network.c986d65a4edd909d74085dca64e35b5220a4abb6e25ac7ab5c2c87b0d76f73eb" Workload="ci--4459--2--2--0--24adfa6772-k8s-calico--apiserver--74c4865b69--rq46q-eth0" Dec 12 17:26:39.325808 containerd[1562]: 2025-12-12 17:26:39.272 [INFO][4235] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c986d65a4edd909d74085dca64e35b5220a4abb6e25ac7ab5c2c87b0d76f73eb" Namespace="calico-apiserver" Pod="calico-apiserver-74c4865b69-rq46q" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-calico--apiserver--74c4865b69--rq46q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--0--24adfa6772-k8s-calico--apiserver--74c4865b69--rq46q-eth0", GenerateName:"calico-apiserver-74c4865b69-", Namespace:"calico-apiserver", SelfLink:"", UID:"96b8454b-22d4-4695-8613-4bfabbdf8fc4", ResourceVersion:"867", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 26, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"74c4865b69", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-0-24adfa6772", ContainerID:"", Pod:"calico-apiserver-74c4865b69-rq46q", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.114.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali04675ea39bf", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:26:39.327933 containerd[1562]: 2025-12-12 17:26:39.272 [INFO][4235] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.114.194/32] ContainerID="c986d65a4edd909d74085dca64e35b5220a4abb6e25ac7ab5c2c87b0d76f73eb" Namespace="calico-apiserver" Pod="calico-apiserver-74c4865b69-rq46q" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-calico--apiserver--74c4865b69--rq46q-eth0" Dec 12 17:26:39.327933 containerd[1562]: 2025-12-12 17:26:39.273 [INFO][4235] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali04675ea39bf ContainerID="c986d65a4edd909d74085dca64e35b5220a4abb6e25ac7ab5c2c87b0d76f73eb" Namespace="calico-apiserver" Pod="calico-apiserver-74c4865b69-rq46q" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-calico--apiserver--74c4865b69--rq46q-eth0" Dec 12 17:26:39.327933 containerd[1562]: 2025-12-12 17:26:39.287 [INFO][4235] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c986d65a4edd909d74085dca64e35b5220a4abb6e25ac7ab5c2c87b0d76f73eb" Namespace="calico-apiserver" Pod="calico-apiserver-74c4865b69-rq46q" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-calico--apiserver--74c4865b69--rq46q-eth0" Dec 12 17:26:39.328267 containerd[1562]: 2025-12-12 17:26:39.289 [INFO][4235] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c986d65a4edd909d74085dca64e35b5220a4abb6e25ac7ab5c2c87b0d76f73eb" Namespace="calico-apiserver" Pod="calico-apiserver-74c4865b69-rq46q" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-calico--apiserver--74c4865b69--rq46q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--0--24adfa6772-k8s-calico--apiserver--74c4865b69--rq46q-eth0", GenerateName:"calico-apiserver-74c4865b69-", Namespace:"calico-apiserver", SelfLink:"", UID:"96b8454b-22d4-4695-8613-4bfabbdf8fc4", ResourceVersion:"867", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 26, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"74c4865b69", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-0-24adfa6772", ContainerID:"c986d65a4edd909d74085dca64e35b5220a4abb6e25ac7ab5c2c87b0d76f73eb", Pod:"calico-apiserver-74c4865b69-rq46q", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.114.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali04675ea39bf", MAC:"72:74:bb:b7:61:9f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:26:39.330106 containerd[1562]: 2025-12-12 17:26:39.312 [INFO][4235] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c986d65a4edd909d74085dca64e35b5220a4abb6e25ac7ab5c2c87b0d76f73eb" Namespace="calico-apiserver" Pod="calico-apiserver-74c4865b69-rq46q" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-calico--apiserver--74c4865b69--rq46q-eth0" Dec 12 17:26:39.361968 containerd[1562]: time="2025-12-12T17:26:39.361521749Z" 
level=info msg="connecting to shim c986d65a4edd909d74085dca64e35b5220a4abb6e25ac7ab5c2c87b0d76f73eb" address="unix:///run/containerd/s/87f9db0947c2850167ee60997cdb3f6b9295386c3b35c22763dac2e5d717151c" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:26:39.406294 systemd[1]: Started cri-containerd-c986d65a4edd909d74085dca64e35b5220a4abb6e25ac7ab5c2c87b0d76f73eb.scope - libcontainer container c986d65a4edd909d74085dca64e35b5220a4abb6e25ac7ab5c2c87b0d76f73eb. Dec 12 17:26:39.452964 containerd[1562]: time="2025-12-12T17:26:39.452846706Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-74c4865b69-rq46q,Uid:96b8454b-22d4-4695-8613-4bfabbdf8fc4,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"c986d65a4edd909d74085dca64e35b5220a4abb6e25ac7ab5c2c87b0d76f73eb\"" Dec 12 17:26:39.456270 containerd[1562]: time="2025-12-12T17:26:39.456218562Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:26:39.809618 containerd[1562]: time="2025-12-12T17:26:39.809469965Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:26:39.811055 containerd[1562]: time="2025-12-12T17:26:39.810987044Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:26:39.811235 containerd[1562]: time="2025-12-12T17:26:39.811115931Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:26:39.811631 kubelet[2788]: E1212 17:26:39.811577 2788 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:26:39.812903 kubelet[2788]: E1212 17:26:39.812336 2788 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:26:39.812903 kubelet[2788]: E1212 17:26:39.812754 2788 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gdt9m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-74c4865b69-rq46q_calico-apiserver(96b8454b-22d4-4695-8613-4bfabbdf8fc4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:26:39.813949 kubelet[2788]: E1212 17:26:39.813897 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-74c4865b69-rq46q" podUID="96b8454b-22d4-4695-8613-4bfabbdf8fc4" Dec 12 17:26:40.106069 containerd[1562]: time="2025-12-12T17:26:40.105110722Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-74c4865b69-4lwrd,Uid:2598be35-bdd7-4b8f-994d-8273d0db5ae9,Namespace:calico-apiserver,Attempt:0,}" Dec 12 17:26:40.106069 containerd[1562]: time="2025-12-12T17:26:40.105126883Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8f9f7d5ff-pj77s,Uid:c7baeefe-1ae7-4ac7-a668-035dfb7baaef,Namespace:calico-system,Attempt:0,}" Dec 12 17:26:40.273222 systemd-networkd[1434]: cali4229fc4194b: Link UP Dec 12 17:26:40.275467 systemd-networkd[1434]: cali4229fc4194b: Gained carrier Dec 12 17:26:40.295101 containerd[1562]: 2025-12-12 17:26:40.169 [INFO][4309] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--0--24adfa6772-k8s-calico--kube--controllers--8f9f7d5ff--pj77s-eth0 calico-kube-controllers-8f9f7d5ff- calico-system c7baeefe-1ae7-4ac7-a668-035dfb7baaef 865 0 2025-12-12 17:26:14 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:8f9f7d5ff projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4459-2-2-0-24adfa6772 calico-kube-controllers-8f9f7d5ff-pj77s eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali4229fc4194b [] [] }} ContainerID="7bfe4659d520d988dea6f046e4d8a61850552185fcbb0d376526893e0cc9509c" Namespace="calico-system" Pod="calico-kube-controllers-8f9f7d5ff-pj77s" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-calico--kube--controllers--8f9f7d5ff--pj77s-" Dec 12 17:26:40.295101 containerd[1562]: 2025-12-12 17:26:40.169 [INFO][4309] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7bfe4659d520d988dea6f046e4d8a61850552185fcbb0d376526893e0cc9509c" Namespace="calico-system" Pod="calico-kube-controllers-8f9f7d5ff-pj77s" 
WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-calico--kube--controllers--8f9f7d5ff--pj77s-eth0" Dec 12 17:26:40.295101 containerd[1562]: 2025-12-12 17:26:40.210 [INFO][4339] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7bfe4659d520d988dea6f046e4d8a61850552185fcbb0d376526893e0cc9509c" HandleID="k8s-pod-network.7bfe4659d520d988dea6f046e4d8a61850552185fcbb0d376526893e0cc9509c" Workload="ci--4459--2--2--0--24adfa6772-k8s-calico--kube--controllers--8f9f7d5ff--pj77s-eth0" Dec 12 17:26:40.295557 containerd[1562]: 2025-12-12 17:26:40.210 [INFO][4339] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="7bfe4659d520d988dea6f046e4d8a61850552185fcbb0d376526893e0cc9509c" HandleID="k8s-pod-network.7bfe4659d520d988dea6f046e4d8a61850552185fcbb0d376526893e0cc9509c" Workload="ci--4459--2--2--0--24adfa6772-k8s-calico--kube--controllers--8f9f7d5ff--pj77s-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002c3830), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-0-24adfa6772", "pod":"calico-kube-controllers-8f9f7d5ff-pj77s", "timestamp":"2025-12-12 17:26:40.210006102 +0000 UTC"}, Hostname:"ci-4459-2-2-0-24adfa6772", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:26:40.295557 containerd[1562]: 2025-12-12 17:26:40.210 [INFO][4339] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:26:40.295557 containerd[1562]: 2025-12-12 17:26:40.210 [INFO][4339] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 17:26:40.295557 containerd[1562]: 2025-12-12 17:26:40.210 [INFO][4339] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-0-24adfa6772' Dec 12 17:26:40.295557 containerd[1562]: 2025-12-12 17:26:40.221 [INFO][4339] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7bfe4659d520d988dea6f046e4d8a61850552185fcbb0d376526893e0cc9509c" host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:40.295557 containerd[1562]: 2025-12-12 17:26:40.228 [INFO][4339] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:40.295557 containerd[1562]: 2025-12-12 17:26:40.238 [INFO][4339] ipam/ipam.go 511: Trying affinity for 192.168.114.192/26 host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:40.295557 containerd[1562]: 2025-12-12 17:26:40.241 [INFO][4339] ipam/ipam.go 158: Attempting to load block cidr=192.168.114.192/26 host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:40.295557 containerd[1562]: 2025-12-12 17:26:40.245 [INFO][4339] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.114.192/26 host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:40.295788 containerd[1562]: 2025-12-12 17:26:40.245 [INFO][4339] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.114.192/26 handle="k8s-pod-network.7bfe4659d520d988dea6f046e4d8a61850552185fcbb0d376526893e0cc9509c" host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:40.295788 containerd[1562]: 2025-12-12 17:26:40.248 [INFO][4339] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.7bfe4659d520d988dea6f046e4d8a61850552185fcbb0d376526893e0cc9509c Dec 12 17:26:40.295788 containerd[1562]: 2025-12-12 17:26:40.254 [INFO][4339] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.114.192/26 handle="k8s-pod-network.7bfe4659d520d988dea6f046e4d8a61850552185fcbb0d376526893e0cc9509c" host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:40.295788 containerd[1562]: 2025-12-12 17:26:40.263 [INFO][4339] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.114.195/26] block=192.168.114.192/26 handle="k8s-pod-network.7bfe4659d520d988dea6f046e4d8a61850552185fcbb0d376526893e0cc9509c" host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:40.295788 containerd[1562]: 2025-12-12 17:26:40.263 [INFO][4339] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.114.195/26] handle="k8s-pod-network.7bfe4659d520d988dea6f046e4d8a61850552185fcbb0d376526893e0cc9509c" host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:40.295788 containerd[1562]: 2025-12-12 17:26:40.263 [INFO][4339] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 17:26:40.295788 containerd[1562]: 2025-12-12 17:26:40.264 [INFO][4339] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.114.195/26] IPv6=[] ContainerID="7bfe4659d520d988dea6f046e4d8a61850552185fcbb0d376526893e0cc9509c" HandleID="k8s-pod-network.7bfe4659d520d988dea6f046e4d8a61850552185fcbb0d376526893e0cc9509c" Workload="ci--4459--2--2--0--24adfa6772-k8s-calico--kube--controllers--8f9f7d5ff--pj77s-eth0" Dec 12 17:26:40.295992 containerd[1562]: 2025-12-12 17:26:40.269 [INFO][4309] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7bfe4659d520d988dea6f046e4d8a61850552185fcbb0d376526893e0cc9509c" Namespace="calico-system" Pod="calico-kube-controllers-8f9f7d5ff-pj77s" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-calico--kube--controllers--8f9f7d5ff--pj77s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--0--24adfa6772-k8s-calico--kube--controllers--8f9f7d5ff--pj77s-eth0", GenerateName:"calico-kube-controllers-8f9f7d5ff-", Namespace:"calico-system", SelfLink:"", UID:"c7baeefe-1ae7-4ac7-a668-035dfb7baaef", ResourceVersion:"865", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 26, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"8f9f7d5ff", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-0-24adfa6772", ContainerID:"", Pod:"calico-kube-controllers-8f9f7d5ff-pj77s", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.114.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali4229fc4194b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:26:40.296050 containerd[1562]: 2025-12-12 17:26:40.269 [INFO][4309] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.114.195/32] ContainerID="7bfe4659d520d988dea6f046e4d8a61850552185fcbb0d376526893e0cc9509c" Namespace="calico-system" Pod="calico-kube-controllers-8f9f7d5ff-pj77s" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-calico--kube--controllers--8f9f7d5ff--pj77s-eth0" Dec 12 17:26:40.296050 containerd[1562]: 2025-12-12 17:26:40.269 [INFO][4309] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4229fc4194b ContainerID="7bfe4659d520d988dea6f046e4d8a61850552185fcbb0d376526893e0cc9509c" Namespace="calico-system" Pod="calico-kube-controllers-8f9f7d5ff-pj77s" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-calico--kube--controllers--8f9f7d5ff--pj77s-eth0" Dec 12 17:26:40.296050 containerd[1562]: 2025-12-12 17:26:40.275 [INFO][4309] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="7bfe4659d520d988dea6f046e4d8a61850552185fcbb0d376526893e0cc9509c" Namespace="calico-system" Pod="calico-kube-controllers-8f9f7d5ff-pj77s" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-calico--kube--controllers--8f9f7d5ff--pj77s-eth0" Dec 12 17:26:40.296112 containerd[1562]: 2025-12-12 17:26:40.276 [INFO][4309] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7bfe4659d520d988dea6f046e4d8a61850552185fcbb0d376526893e0cc9509c" Namespace="calico-system" Pod="calico-kube-controllers-8f9f7d5ff-pj77s" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-calico--kube--controllers--8f9f7d5ff--pj77s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--0--24adfa6772-k8s-calico--kube--controllers--8f9f7d5ff--pj77s-eth0", GenerateName:"calico-kube-controllers-8f9f7d5ff-", Namespace:"calico-system", SelfLink:"", UID:"c7baeefe-1ae7-4ac7-a668-035dfb7baaef", ResourceVersion:"865", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 26, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"8f9f7d5ff", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-0-24adfa6772", ContainerID:"7bfe4659d520d988dea6f046e4d8a61850552185fcbb0d376526893e0cc9509c", Pod:"calico-kube-controllers-8f9f7d5ff-pj77s", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.114.195/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali4229fc4194b", MAC:"82:68:b5:a0:91:12", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:26:40.296163 containerd[1562]: 2025-12-12 17:26:40.291 [INFO][4309] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7bfe4659d520d988dea6f046e4d8a61850552185fcbb0d376526893e0cc9509c" Namespace="calico-system" Pod="calico-kube-controllers-8f9f7d5ff-pj77s" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-calico--kube--controllers--8f9f7d5ff--pj77s-eth0" Dec 12 17:26:40.333316 containerd[1562]: time="2025-12-12T17:26:40.333242335Z" level=info msg="connecting to shim 7bfe4659d520d988dea6f046e4d8a61850552185fcbb0d376526893e0cc9509c" address="unix:///run/containerd/s/98bbbc5a89f0e48e8ae15acf9e071869609a3f4ce2b897367a81e41408d8e308" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:26:40.365996 kubelet[2788]: E1212 17:26:40.365316 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-74c4865b69-rq46q" podUID="96b8454b-22d4-4695-8613-4bfabbdf8fc4" Dec 12 17:26:40.393543 systemd[1]: Started cri-containerd-7bfe4659d520d988dea6f046e4d8a61850552185fcbb0d376526893e0cc9509c.scope - libcontainer container 7bfe4659d520d988dea6f046e4d8a61850552185fcbb0d376526893e0cc9509c. 
Dec 12 17:26:40.403069 systemd-networkd[1434]: cali1247fff082e: Link UP Dec 12 17:26:40.403646 systemd-networkd[1434]: cali1247fff082e: Gained carrier Dec 12 17:26:40.433961 containerd[1562]: 2025-12-12 17:26:40.164 [INFO][4317] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--0--24adfa6772-k8s-calico--apiserver--74c4865b69--4lwrd-eth0 calico-apiserver-74c4865b69- calico-apiserver 2598be35-bdd7-4b8f-994d-8273d0db5ae9 870 0 2025-12-12 17:26:05 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:74c4865b69 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-2-0-24adfa6772 calico-apiserver-74c4865b69-4lwrd eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali1247fff082e [] [] }} ContainerID="1f957e594482f766fe64a23a19cf6a7b921ccd66fc13156e45a79273b4be650b" Namespace="calico-apiserver" Pod="calico-apiserver-74c4865b69-4lwrd" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-calico--apiserver--74c4865b69--4lwrd-" Dec 12 17:26:40.433961 containerd[1562]: 2025-12-12 17:26:40.164 [INFO][4317] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1f957e594482f766fe64a23a19cf6a7b921ccd66fc13156e45a79273b4be650b" Namespace="calico-apiserver" Pod="calico-apiserver-74c4865b69-4lwrd" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-calico--apiserver--74c4865b69--4lwrd-eth0" Dec 12 17:26:40.433961 containerd[1562]: 2025-12-12 17:26:40.212 [INFO][4334] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1f957e594482f766fe64a23a19cf6a7b921ccd66fc13156e45a79273b4be650b" HandleID="k8s-pod-network.1f957e594482f766fe64a23a19cf6a7b921ccd66fc13156e45a79273b4be650b" Workload="ci--4459--2--2--0--24adfa6772-k8s-calico--apiserver--74c4865b69--4lwrd-eth0" Dec 12 
17:26:40.434181 containerd[1562]: 2025-12-12 17:26:40.213 [INFO][4334] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="1f957e594482f766fe64a23a19cf6a7b921ccd66fc13156e45a79273b4be650b" HandleID="k8s-pod-network.1f957e594482f766fe64a23a19cf6a7b921ccd66fc13156e45a79273b4be650b" Workload="ci--4459--2--2--0--24adfa6772-k8s-calico--apiserver--74c4865b69--4lwrd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3110), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459-2-2-0-24adfa6772", "pod":"calico-apiserver-74c4865b69-4lwrd", "timestamp":"2025-12-12 17:26:40.212064567 +0000 UTC"}, Hostname:"ci-4459-2-2-0-24adfa6772", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:26:40.434181 containerd[1562]: 2025-12-12 17:26:40.213 [INFO][4334] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:26:40.434181 containerd[1562]: 2025-12-12 17:26:40.263 [INFO][4334] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 17:26:40.434181 containerd[1562]: 2025-12-12 17:26:40.264 [INFO][4334] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-0-24adfa6772' Dec 12 17:26:40.434181 containerd[1562]: 2025-12-12 17:26:40.324 [INFO][4334] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1f957e594482f766fe64a23a19cf6a7b921ccd66fc13156e45a79273b4be650b" host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:40.434181 containerd[1562]: 2025-12-12 17:26:40.332 [INFO][4334] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:40.434181 containerd[1562]: 2025-12-12 17:26:40.342 [INFO][4334] ipam/ipam.go 511: Trying affinity for 192.168.114.192/26 host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:40.434181 containerd[1562]: 2025-12-12 17:26:40.350 [INFO][4334] ipam/ipam.go 158: Attempting to load block cidr=192.168.114.192/26 host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:40.434181 containerd[1562]: 2025-12-12 17:26:40.355 [INFO][4334] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.114.192/26 host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:40.434387 containerd[1562]: 2025-12-12 17:26:40.356 [INFO][4334] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.114.192/26 handle="k8s-pod-network.1f957e594482f766fe64a23a19cf6a7b921ccd66fc13156e45a79273b4be650b" host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:40.434387 containerd[1562]: 2025-12-12 17:26:40.359 [INFO][4334] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.1f957e594482f766fe64a23a19cf6a7b921ccd66fc13156e45a79273b4be650b Dec 12 17:26:40.434387 containerd[1562]: 2025-12-12 17:26:40.370 [INFO][4334] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.114.192/26 handle="k8s-pod-network.1f957e594482f766fe64a23a19cf6a7b921ccd66fc13156e45a79273b4be650b" host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:40.434387 containerd[1562]: 2025-12-12 17:26:40.385 [INFO][4334] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.114.196/26] block=192.168.114.192/26 handle="k8s-pod-network.1f957e594482f766fe64a23a19cf6a7b921ccd66fc13156e45a79273b4be650b" host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:40.434387 containerd[1562]: 2025-12-12 17:26:40.386 [INFO][4334] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.114.196/26] handle="k8s-pod-network.1f957e594482f766fe64a23a19cf6a7b921ccd66fc13156e45a79273b4be650b" host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:40.434387 containerd[1562]: 2025-12-12 17:26:40.389 [INFO][4334] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 17:26:40.434387 containerd[1562]: 2025-12-12 17:26:40.390 [INFO][4334] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.114.196/26] IPv6=[] ContainerID="1f957e594482f766fe64a23a19cf6a7b921ccd66fc13156e45a79273b4be650b" HandleID="k8s-pod-network.1f957e594482f766fe64a23a19cf6a7b921ccd66fc13156e45a79273b4be650b" Workload="ci--4459--2--2--0--24adfa6772-k8s-calico--apiserver--74c4865b69--4lwrd-eth0" Dec 12 17:26:40.435337 containerd[1562]: 2025-12-12 17:26:40.396 [INFO][4317] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1f957e594482f766fe64a23a19cf6a7b921ccd66fc13156e45a79273b4be650b" Namespace="calico-apiserver" Pod="calico-apiserver-74c4865b69-4lwrd" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-calico--apiserver--74c4865b69--4lwrd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--0--24adfa6772-k8s-calico--apiserver--74c4865b69--4lwrd-eth0", GenerateName:"calico-apiserver-74c4865b69-", Namespace:"calico-apiserver", SelfLink:"", UID:"2598be35-bdd7-4b8f-994d-8273d0db5ae9", ResourceVersion:"870", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 26, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"74c4865b69", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-0-24adfa6772", ContainerID:"", Pod:"calico-apiserver-74c4865b69-4lwrd", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.114.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali1247fff082e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:26:40.435466 containerd[1562]: 2025-12-12 17:26:40.396 [INFO][4317] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.114.196/32] ContainerID="1f957e594482f766fe64a23a19cf6a7b921ccd66fc13156e45a79273b4be650b" Namespace="calico-apiserver" Pod="calico-apiserver-74c4865b69-4lwrd" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-calico--apiserver--74c4865b69--4lwrd-eth0" Dec 12 17:26:40.435466 containerd[1562]: 2025-12-12 17:26:40.396 [INFO][4317] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1247fff082e ContainerID="1f957e594482f766fe64a23a19cf6a7b921ccd66fc13156e45a79273b4be650b" Namespace="calico-apiserver" Pod="calico-apiserver-74c4865b69-4lwrd" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-calico--apiserver--74c4865b69--4lwrd-eth0" Dec 12 17:26:40.435466 containerd[1562]: 2025-12-12 17:26:40.405 [INFO][4317] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1f957e594482f766fe64a23a19cf6a7b921ccd66fc13156e45a79273b4be650b" Namespace="calico-apiserver" 
Pod="calico-apiserver-74c4865b69-4lwrd" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-calico--apiserver--74c4865b69--4lwrd-eth0" Dec 12 17:26:40.435538 containerd[1562]: 2025-12-12 17:26:40.406 [INFO][4317] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1f957e594482f766fe64a23a19cf6a7b921ccd66fc13156e45a79273b4be650b" Namespace="calico-apiserver" Pod="calico-apiserver-74c4865b69-4lwrd" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-calico--apiserver--74c4865b69--4lwrd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--0--24adfa6772-k8s-calico--apiserver--74c4865b69--4lwrd-eth0", GenerateName:"calico-apiserver-74c4865b69-", Namespace:"calico-apiserver", SelfLink:"", UID:"2598be35-bdd7-4b8f-994d-8273d0db5ae9", ResourceVersion:"870", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 26, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"74c4865b69", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-0-24adfa6772", ContainerID:"1f957e594482f766fe64a23a19cf6a7b921ccd66fc13156e45a79273b4be650b", Pod:"calico-apiserver-74c4865b69-4lwrd", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.114.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, 
InterfaceName:"cali1247fff082e", MAC:"26:d5:4b:1d:56:d9", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:26:40.435657 containerd[1562]: 2025-12-12 17:26:40.430 [INFO][4317] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1f957e594482f766fe64a23a19cf6a7b921ccd66fc13156e45a79273b4be650b" Namespace="calico-apiserver" Pod="calico-apiserver-74c4865b69-4lwrd" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-calico--apiserver--74c4865b69--4lwrd-eth0" Dec 12 17:26:40.475626 containerd[1562]: time="2025-12-12T17:26:40.475556019Z" level=info msg="connecting to shim 1f957e594482f766fe64a23a19cf6a7b921ccd66fc13156e45a79273b4be650b" address="unix:///run/containerd/s/cbea2f082cf12afddf354a4faa220d2a2e2a22c155a29ed5e3094c112330cdce" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:26:40.516115 systemd[1]: Started cri-containerd-1f957e594482f766fe64a23a19cf6a7b921ccd66fc13156e45a79273b4be650b.scope - libcontainer container 1f957e594482f766fe64a23a19cf6a7b921ccd66fc13156e45a79273b4be650b. 
Dec 12 17:26:40.570845 containerd[1562]: time="2025-12-12T17:26:40.570650539Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8f9f7d5ff-pj77s,Uid:c7baeefe-1ae7-4ac7-a668-035dfb7baaef,Namespace:calico-system,Attempt:0,} returns sandbox id \"7bfe4659d520d988dea6f046e4d8a61850552185fcbb0d376526893e0cc9509c\"" Dec 12 17:26:40.573671 containerd[1562]: time="2025-12-12T17:26:40.573629491Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 12 17:26:40.629777 containerd[1562]: time="2025-12-12T17:26:40.629119036Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-74c4865b69-4lwrd,Uid:2598be35-bdd7-4b8f-994d-8273d0db5ae9,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"1f957e594482f766fe64a23a19cf6a7b921ccd66fc13156e45a79273b4be650b\"" Dec 12 17:26:40.748034 systemd-networkd[1434]: cali04675ea39bf: Gained IPv6LL Dec 12 17:26:40.925858 containerd[1562]: time="2025-12-12T17:26:40.925655570Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:26:40.927340 containerd[1562]: time="2025-12-12T17:26:40.927262692Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 12 17:26:40.927521 containerd[1562]: time="2025-12-12T17:26:40.927372657Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 12 17:26:40.928042 kubelet[2788]: E1212 17:26:40.927887 2788 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:26:40.928418 kubelet[2788]: E1212 17:26:40.928049 2788 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:26:40.930418 containerd[1562]: time="2025-12-12T17:26:40.930353729Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:26:40.930538 kubelet[2788]: E1212 17:26:40.930470 2788 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api
-access-xwlm8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-8f9f7d5ff-pj77s_calico-system(c7baeefe-1ae7-4ac7-a668-035dfb7baaef): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 12 17:26:40.931805 kubelet[2788]: E1212 17:26:40.931655 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull 
and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8f9f7d5ff-pj77s" podUID="c7baeefe-1ae7-4ac7-a668-035dfb7baaef" Dec 12 17:26:41.281577 containerd[1562]: time="2025-12-12T17:26:41.281490958Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:26:41.283423 containerd[1562]: time="2025-12-12T17:26:41.283294008Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:26:41.283555 containerd[1562]: time="2025-12-12T17:26:41.283357611Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:26:41.283886 kubelet[2788]: E1212 17:26:41.283779 2788 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:26:41.283886 kubelet[2788]: E1212 17:26:41.283837 2788 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:26:41.284230 kubelet[2788]: E1212 17:26:41.284057 2788 kuberuntime_manager.go:1358] "Unhandled Error" 
err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l79zc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-74c4865b69-4lwrd_calico-apiserver(2598be35-bdd7-4b8f-994d-8273d0db5ae9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:26:41.285430 kubelet[2788]: E1212 17:26:41.285358 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-74c4865b69-4lwrd" podUID="2598be35-bdd7-4b8f-994d-8273d0db5ae9" Dec 12 17:26:41.370893 kubelet[2788]: E1212 17:26:41.370728 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off 
pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-74c4865b69-4lwrd" podUID="2598be35-bdd7-4b8f-994d-8273d0db5ae9" Dec 12 17:26:41.375089 kubelet[2788]: E1212 17:26:41.375037 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-74c4865b69-rq46q" podUID="96b8454b-22d4-4695-8613-4bfabbdf8fc4" Dec 12 17:26:41.375463 kubelet[2788]: E1212 17:26:41.375267 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8f9f7d5ff-pj77s" podUID="c7baeefe-1ae7-4ac7-a668-035dfb7baaef" Dec 12 17:26:42.028272 systemd-networkd[1434]: cali1247fff082e: Gained IPv6LL Dec 12 17:26:42.105396 containerd[1562]: time="2025-12-12T17:26:42.105341985Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:csi-node-driver-qj8f4,Uid:b3554d30-e274-4b18-8389-696d2cc03c37,Namespace:calico-system,Attempt:0,}" Dec 12 17:26:42.156161 systemd-networkd[1434]: cali4229fc4194b: Gained IPv6LL Dec 12 17:26:42.251719 systemd-networkd[1434]: cali80b84e7e958: Link UP Dec 12 17:26:42.253415 systemd-networkd[1434]: cali80b84e7e958: Gained carrier Dec 12 17:26:42.278128 containerd[1562]: 2025-12-12 17:26:42.166 [INFO][4465] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--0--24adfa6772-k8s-csi--node--driver--qj8f4-eth0 csi-node-driver- calico-system b3554d30-e274-4b18-8389-696d2cc03c37 754 0 2025-12-12 17:26:14 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:857b56db8f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4459-2-2-0-24adfa6772 csi-node-driver-qj8f4 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali80b84e7e958 [] [] }} ContainerID="09a55d58eb7fba3668a9926f43e2d8ba09f5ad1dd7b612d08beead9449415e50" Namespace="calico-system" Pod="csi-node-driver-qj8f4" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-csi--node--driver--qj8f4-" Dec 12 17:26:42.278128 containerd[1562]: 2025-12-12 17:26:42.166 [INFO][4465] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="09a55d58eb7fba3668a9926f43e2d8ba09f5ad1dd7b612d08beead9449415e50" Namespace="calico-system" Pod="csi-node-driver-qj8f4" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-csi--node--driver--qj8f4-eth0" Dec 12 17:26:42.278128 containerd[1562]: 2025-12-12 17:26:42.202 [INFO][4478] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="09a55d58eb7fba3668a9926f43e2d8ba09f5ad1dd7b612d08beead9449415e50" 
HandleID="k8s-pod-network.09a55d58eb7fba3668a9926f43e2d8ba09f5ad1dd7b612d08beead9449415e50" Workload="ci--4459--2--2--0--24adfa6772-k8s-csi--node--driver--qj8f4-eth0" Dec 12 17:26:42.278389 containerd[1562]: 2025-12-12 17:26:42.202 [INFO][4478] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="09a55d58eb7fba3668a9926f43e2d8ba09f5ad1dd7b612d08beead9449415e50" HandleID="k8s-pod-network.09a55d58eb7fba3668a9926f43e2d8ba09f5ad1dd7b612d08beead9449415e50" Workload="ci--4459--2--2--0--24adfa6772-k8s-csi--node--driver--qj8f4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b6b0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-0-24adfa6772", "pod":"csi-node-driver-qj8f4", "timestamp":"2025-12-12 17:26:42.20211469 +0000 UTC"}, Hostname:"ci-4459-2-2-0-24adfa6772", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:26:42.278389 containerd[1562]: 2025-12-12 17:26:42.202 [INFO][4478] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:26:42.278389 containerd[1562]: 2025-12-12 17:26:42.202 [INFO][4478] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 17:26:42.278389 containerd[1562]: 2025-12-12 17:26:42.202 [INFO][4478] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-0-24adfa6772' Dec 12 17:26:42.278389 containerd[1562]: 2025-12-12 17:26:42.212 [INFO][4478] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.09a55d58eb7fba3668a9926f43e2d8ba09f5ad1dd7b612d08beead9449415e50" host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:42.278389 containerd[1562]: 2025-12-12 17:26:42.217 [INFO][4478] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:42.278389 containerd[1562]: 2025-12-12 17:26:42.224 [INFO][4478] ipam/ipam.go 511: Trying affinity for 192.168.114.192/26 host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:42.278389 containerd[1562]: 2025-12-12 17:26:42.226 [INFO][4478] ipam/ipam.go 158: Attempting to load block cidr=192.168.114.192/26 host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:42.278389 containerd[1562]: 2025-12-12 17:26:42.229 [INFO][4478] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.114.192/26 host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:42.278609 containerd[1562]: 2025-12-12 17:26:42.229 [INFO][4478] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.114.192/26 handle="k8s-pod-network.09a55d58eb7fba3668a9926f43e2d8ba09f5ad1dd7b612d08beead9449415e50" host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:42.278609 containerd[1562]: 2025-12-12 17:26:42.230 [INFO][4478] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.09a55d58eb7fba3668a9926f43e2d8ba09f5ad1dd7b612d08beead9449415e50 Dec 12 17:26:42.278609 containerd[1562]: 2025-12-12 17:26:42.237 [INFO][4478] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.114.192/26 handle="k8s-pod-network.09a55d58eb7fba3668a9926f43e2d8ba09f5ad1dd7b612d08beead9449415e50" host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:42.278609 containerd[1562]: 2025-12-12 17:26:42.245 [INFO][4478] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.114.197/26] block=192.168.114.192/26 handle="k8s-pod-network.09a55d58eb7fba3668a9926f43e2d8ba09f5ad1dd7b612d08beead9449415e50" host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:42.278609 containerd[1562]: 2025-12-12 17:26:42.245 [INFO][4478] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.114.197/26] handle="k8s-pod-network.09a55d58eb7fba3668a9926f43e2d8ba09f5ad1dd7b612d08beead9449415e50" host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:42.278609 containerd[1562]: 2025-12-12 17:26:42.245 [INFO][4478] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 17:26:42.278609 containerd[1562]: 2025-12-12 17:26:42.245 [INFO][4478] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.114.197/26] IPv6=[] ContainerID="09a55d58eb7fba3668a9926f43e2d8ba09f5ad1dd7b612d08beead9449415e50" HandleID="k8s-pod-network.09a55d58eb7fba3668a9926f43e2d8ba09f5ad1dd7b612d08beead9449415e50" Workload="ci--4459--2--2--0--24adfa6772-k8s-csi--node--driver--qj8f4-eth0" Dec 12 17:26:42.279672 containerd[1562]: 2025-12-12 17:26:42.248 [INFO][4465] cni-plugin/k8s.go 418: Populated endpoint ContainerID="09a55d58eb7fba3668a9926f43e2d8ba09f5ad1dd7b612d08beead9449415e50" Namespace="calico-system" Pod="csi-node-driver-qj8f4" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-csi--node--driver--qj8f4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--0--24adfa6772-k8s-csi--node--driver--qj8f4-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"b3554d30-e274-4b18-8389-696d2cc03c37", ResourceVersion:"754", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 26, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-0-24adfa6772", ContainerID:"", Pod:"csi-node-driver-qj8f4", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.114.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali80b84e7e958", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:26:42.280036 containerd[1562]: 2025-12-12 17:26:42.248 [INFO][4465] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.114.197/32] ContainerID="09a55d58eb7fba3668a9926f43e2d8ba09f5ad1dd7b612d08beead9449415e50" Namespace="calico-system" Pod="csi-node-driver-qj8f4" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-csi--node--driver--qj8f4-eth0" Dec 12 17:26:42.280036 containerd[1562]: 2025-12-12 17:26:42.248 [INFO][4465] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali80b84e7e958 ContainerID="09a55d58eb7fba3668a9926f43e2d8ba09f5ad1dd7b612d08beead9449415e50" Namespace="calico-system" Pod="csi-node-driver-qj8f4" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-csi--node--driver--qj8f4-eth0" Dec 12 17:26:42.280036 containerd[1562]: 2025-12-12 17:26:42.252 [INFO][4465] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="09a55d58eb7fba3668a9926f43e2d8ba09f5ad1dd7b612d08beead9449415e50" Namespace="calico-system" Pod="csi-node-driver-qj8f4" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-csi--node--driver--qj8f4-eth0" Dec 12 17:26:42.280425 
containerd[1562]: 2025-12-12 17:26:42.253 [INFO][4465] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="09a55d58eb7fba3668a9926f43e2d8ba09f5ad1dd7b612d08beead9449415e50" Namespace="calico-system" Pod="csi-node-driver-qj8f4" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-csi--node--driver--qj8f4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--0--24adfa6772-k8s-csi--node--driver--qj8f4-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"b3554d30-e274-4b18-8389-696d2cc03c37", ResourceVersion:"754", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 26, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-0-24adfa6772", ContainerID:"09a55d58eb7fba3668a9926f43e2d8ba09f5ad1dd7b612d08beead9449415e50", Pod:"csi-node-driver-qj8f4", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.114.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali80b84e7e958", MAC:"a6:f5:8b:3d:96:ad", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:26:42.280491 
containerd[1562]: 2025-12-12 17:26:42.273 [INFO][4465] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="09a55d58eb7fba3668a9926f43e2d8ba09f5ad1dd7b612d08beead9449415e50" Namespace="calico-system" Pod="csi-node-driver-qj8f4" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-csi--node--driver--qj8f4-eth0" Dec 12 17:26:42.323487 containerd[1562]: time="2025-12-12T17:26:42.323429229Z" level=info msg="connecting to shim 09a55d58eb7fba3668a9926f43e2d8ba09f5ad1dd7b612d08beead9449415e50" address="unix:///run/containerd/s/248fa92b259b0f488fd41db06dfa8133de61c6cbd7eecc8ccfa20b7511c241e9" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:26:42.355082 systemd[1]: Started cri-containerd-09a55d58eb7fba3668a9926f43e2d8ba09f5ad1dd7b612d08beead9449415e50.scope - libcontainer container 09a55d58eb7fba3668a9926f43e2d8ba09f5ad1dd7b612d08beead9449415e50. Dec 12 17:26:42.378269 kubelet[2788]: E1212 17:26:42.378058 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8f9f7d5ff-pj77s" podUID="c7baeefe-1ae7-4ac7-a668-035dfb7baaef" Dec 12 17:26:42.380128 kubelet[2788]: E1212 17:26:42.379953 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference 
\\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-74c4865b69-4lwrd" podUID="2598be35-bdd7-4b8f-994d-8273d0db5ae9" Dec 12 17:26:42.387251 containerd[1562]: time="2025-12-12T17:26:42.387165088Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-qj8f4,Uid:b3554d30-e274-4b18-8389-696d2cc03c37,Namespace:calico-system,Attempt:0,} returns sandbox id \"09a55d58eb7fba3668a9926f43e2d8ba09f5ad1dd7b612d08beead9449415e50\"" Dec 12 17:26:42.406130 containerd[1562]: time="2025-12-12T17:26:42.406080528Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 12 17:26:42.761090 containerd[1562]: time="2025-12-12T17:26:42.760996106Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:26:42.762849 containerd[1562]: time="2025-12-12T17:26:42.762746991Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 12 17:26:42.763023 containerd[1562]: time="2025-12-12T17:26:42.762934680Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 12 17:26:42.763285 kubelet[2788]: E1212 17:26:42.763221 2788 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:26:42.763285 kubelet[2788]: E1212 17:26:42.763280 2788 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:26:42.763685 kubelet[2788]: E1212 17:26:42.763627 2788 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6lrv7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},Te
rminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-qj8f4_calico-system(b3554d30-e274-4b18-8389-696d2cc03c37): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 12 17:26:42.766449 containerd[1562]: time="2025-12-12T17:26:42.766405369Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 12 17:26:43.106313 containerd[1562]: time="2025-12-12T17:26:43.106158617Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-vfbql,Uid:a29c7f30-65b8-4e18-856c-c9d5d13aec46,Namespace:kube-system,Attempt:0,}" Dec 12 17:26:43.123099 containerd[1562]: time="2025-12-12T17:26:43.123044339Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:26:43.124654 containerd[1562]: time="2025-12-12T17:26:43.124568092Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 12 17:26:43.124768 containerd[1562]: time="2025-12-12T17:26:43.124601613Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 12 17:26:43.125434 kubelet[2788]: E1212 17:26:43.125001 2788 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:26:43.125434 kubelet[2788]: E1212 17:26:43.125053 2788 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:26:43.125434 kubelet[2788]: E1212 17:26:43.125176 2788 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6lrv7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,Terminat
ionMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-qj8f4_calico-system(b3554d30-e274-4b18-8389-696d2cc03c37): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 12 17:26:43.126979 kubelet[2788]: E1212 17:26:43.126895 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-qj8f4" podUID="b3554d30-e274-4b18-8389-696d2cc03c37" Dec 12 17:26:43.262199 systemd-networkd[1434]: calice628b17ca9: Link UP Dec 12 17:26:43.266373 
systemd-networkd[1434]: calice628b17ca9: Gained carrier Dec 12 17:26:43.286979 containerd[1562]: 2025-12-12 17:26:43.162 [INFO][4545] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--0--24adfa6772-k8s-coredns--674b8bbfcf--vfbql-eth0 coredns-674b8bbfcf- kube-system a29c7f30-65b8-4e18-856c-c9d5d13aec46 869 0 2025-12-12 17:25:54 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-2-0-24adfa6772 coredns-674b8bbfcf-vfbql eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calice628b17ca9 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="023e9252f47b818535a262ef506960c0439b33ee0f07cd5d975662d6930b93ac" Namespace="kube-system" Pod="coredns-674b8bbfcf-vfbql" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-coredns--674b8bbfcf--vfbql-" Dec 12 17:26:43.286979 containerd[1562]: 2025-12-12 17:26:43.162 [INFO][4545] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="023e9252f47b818535a262ef506960c0439b33ee0f07cd5d975662d6930b93ac" Namespace="kube-system" Pod="coredns-674b8bbfcf-vfbql" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-coredns--674b8bbfcf--vfbql-eth0" Dec 12 17:26:43.286979 containerd[1562]: 2025-12-12 17:26:43.199 [INFO][4553] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="023e9252f47b818535a262ef506960c0439b33ee0f07cd5d975662d6930b93ac" HandleID="k8s-pod-network.023e9252f47b818535a262ef506960c0439b33ee0f07cd5d975662d6930b93ac" Workload="ci--4459--2--2--0--24adfa6772-k8s-coredns--674b8bbfcf--vfbql-eth0" Dec 12 17:26:43.287548 containerd[1562]: 2025-12-12 17:26:43.199 [INFO][4553] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="023e9252f47b818535a262ef506960c0439b33ee0f07cd5d975662d6930b93ac" 
HandleID="k8s-pod-network.023e9252f47b818535a262ef506960c0439b33ee0f07cd5d975662d6930b93ac" Workload="ci--4459--2--2--0--24adfa6772-k8s-coredns--674b8bbfcf--vfbql-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002aa3f0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-2-0-24adfa6772", "pod":"coredns-674b8bbfcf-vfbql", "timestamp":"2025-12-12 17:26:43.19920808 +0000 UTC"}, Hostname:"ci-4459-2-2-0-24adfa6772", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:26:43.287548 containerd[1562]: 2025-12-12 17:26:43.199 [INFO][4553] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:26:43.287548 containerd[1562]: 2025-12-12 17:26:43.199 [INFO][4553] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 12 17:26:43.287548 containerd[1562]: 2025-12-12 17:26:43.199 [INFO][4553] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-0-24adfa6772' Dec 12 17:26:43.287548 containerd[1562]: 2025-12-12 17:26:43.213 [INFO][4553] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.023e9252f47b818535a262ef506960c0439b33ee0f07cd5d975662d6930b93ac" host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:43.287548 containerd[1562]: 2025-12-12 17:26:43.220 [INFO][4553] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:43.287548 containerd[1562]: 2025-12-12 17:26:43.227 [INFO][4553] ipam/ipam.go 511: Trying affinity for 192.168.114.192/26 host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:43.287548 containerd[1562]: 2025-12-12 17:26:43.230 [INFO][4553] ipam/ipam.go 158: Attempting to load block cidr=192.168.114.192/26 host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:43.287548 containerd[1562]: 2025-12-12 17:26:43.233 [INFO][4553] ipam/ipam.go 235: Affinity is confirmed and block 
has been loaded cidr=192.168.114.192/26 host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:43.288078 containerd[1562]: 2025-12-12 17:26:43.234 [INFO][4553] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.114.192/26 handle="k8s-pod-network.023e9252f47b818535a262ef506960c0439b33ee0f07cd5d975662d6930b93ac" host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:43.288078 containerd[1562]: 2025-12-12 17:26:43.235 [INFO][4553] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.023e9252f47b818535a262ef506960c0439b33ee0f07cd5d975662d6930b93ac Dec 12 17:26:43.288078 containerd[1562]: 2025-12-12 17:26:43.243 [INFO][4553] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.114.192/26 handle="k8s-pod-network.023e9252f47b818535a262ef506960c0439b33ee0f07cd5d975662d6930b93ac" host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:43.288078 containerd[1562]: 2025-12-12 17:26:43.251 [INFO][4553] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.114.198/26] block=192.168.114.192/26 handle="k8s-pod-network.023e9252f47b818535a262ef506960c0439b33ee0f07cd5d975662d6930b93ac" host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:43.288078 containerd[1562]: 2025-12-12 17:26:43.251 [INFO][4553] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.114.198/26] handle="k8s-pod-network.023e9252f47b818535a262ef506960c0439b33ee0f07cd5d975662d6930b93ac" host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:43.288078 containerd[1562]: 2025-12-12 17:26:43.251 [INFO][4553] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 12 17:26:43.288078 containerd[1562]: 2025-12-12 17:26:43.252 [INFO][4553] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.114.198/26] IPv6=[] ContainerID="023e9252f47b818535a262ef506960c0439b33ee0f07cd5d975662d6930b93ac" HandleID="k8s-pod-network.023e9252f47b818535a262ef506960c0439b33ee0f07cd5d975662d6930b93ac" Workload="ci--4459--2--2--0--24adfa6772-k8s-coredns--674b8bbfcf--vfbql-eth0" Dec 12 17:26:43.288368 containerd[1562]: 2025-12-12 17:26:43.255 [INFO][4545] cni-plugin/k8s.go 418: Populated endpoint ContainerID="023e9252f47b818535a262ef506960c0439b33ee0f07cd5d975662d6930b93ac" Namespace="kube-system" Pod="coredns-674b8bbfcf-vfbql" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-coredns--674b8bbfcf--vfbql-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--0--24adfa6772-k8s-coredns--674b8bbfcf--vfbql-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"a29c7f30-65b8-4e18-856c-c9d5d13aec46", ResourceVersion:"869", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 25, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-0-24adfa6772", ContainerID:"", Pod:"coredns-674b8bbfcf-vfbql", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.114.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"calice628b17ca9", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:26:43.288368 containerd[1562]: 2025-12-12 17:26:43.255 [INFO][4545] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.114.198/32] ContainerID="023e9252f47b818535a262ef506960c0439b33ee0f07cd5d975662d6930b93ac" Namespace="kube-system" Pod="coredns-674b8bbfcf-vfbql" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-coredns--674b8bbfcf--vfbql-eth0" Dec 12 17:26:43.288368 containerd[1562]: 2025-12-12 17:26:43.255 [INFO][4545] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calice628b17ca9 ContainerID="023e9252f47b818535a262ef506960c0439b33ee0f07cd5d975662d6930b93ac" Namespace="kube-system" Pod="coredns-674b8bbfcf-vfbql" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-coredns--674b8bbfcf--vfbql-eth0" Dec 12 17:26:43.288368 containerd[1562]: 2025-12-12 17:26:43.266 [INFO][4545] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="023e9252f47b818535a262ef506960c0439b33ee0f07cd5d975662d6930b93ac" Namespace="kube-system" Pod="coredns-674b8bbfcf-vfbql" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-coredns--674b8bbfcf--vfbql-eth0" Dec 12 17:26:43.288368 containerd[1562]: 2025-12-12 17:26:43.269 [INFO][4545] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="023e9252f47b818535a262ef506960c0439b33ee0f07cd5d975662d6930b93ac" Namespace="kube-system" Pod="coredns-674b8bbfcf-vfbql" 
WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-coredns--674b8bbfcf--vfbql-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--0--24adfa6772-k8s-coredns--674b8bbfcf--vfbql-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"a29c7f30-65b8-4e18-856c-c9d5d13aec46", ResourceVersion:"869", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 25, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-0-24adfa6772", ContainerID:"023e9252f47b818535a262ef506960c0439b33ee0f07cd5d975662d6930b93ac", Pod:"coredns-674b8bbfcf-vfbql", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.114.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calice628b17ca9", MAC:"da:0e:9e:c1:0c:55", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:26:43.288368 
containerd[1562]: 2025-12-12 17:26:43.285 [INFO][4545] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="023e9252f47b818535a262ef506960c0439b33ee0f07cd5d975662d6930b93ac" Namespace="kube-system" Pod="coredns-674b8bbfcf-vfbql" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-coredns--674b8bbfcf--vfbql-eth0" Dec 12 17:26:43.318081 containerd[1562]: time="2025-12-12T17:26:43.317705794Z" level=info msg="connecting to shim 023e9252f47b818535a262ef506960c0439b33ee0f07cd5d975662d6930b93ac" address="unix:///run/containerd/s/e1b02050aaac68f76c3b6d3ed775442e680606b77714447a6724b65b4b2738e2" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:26:43.361165 systemd[1]: Started cri-containerd-023e9252f47b818535a262ef506960c0439b33ee0f07cd5d975662d6930b93ac.scope - libcontainer container 023e9252f47b818535a262ef506960c0439b33ee0f07cd5d975662d6930b93ac. Dec 12 17:26:43.391331 kubelet[2788]: E1212 17:26:43.391071 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-qj8f4" podUID="b3554d30-e274-4b18-8389-696d2cc03c37" Dec 12 17:26:43.436118 systemd-networkd[1434]: 
cali80b84e7e958: Gained IPv6LL Dec 12 17:26:43.439749 containerd[1562]: time="2025-12-12T17:26:43.438684905Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-vfbql,Uid:a29c7f30-65b8-4e18-856c-c9d5d13aec46,Namespace:kube-system,Attempt:0,} returns sandbox id \"023e9252f47b818535a262ef506960c0439b33ee0f07cd5d975662d6930b93ac\"" Dec 12 17:26:43.450799 containerd[1562]: time="2025-12-12T17:26:43.448970874Z" level=info msg="CreateContainer within sandbox \"023e9252f47b818535a262ef506960c0439b33ee0f07cd5d975662d6930b93ac\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 12 17:26:43.489973 containerd[1562]: time="2025-12-12T17:26:43.489924781Z" level=info msg="Container a116105e6a3e8689f27f15b3638edd513010ad2d3e69fff60019cf9fd86e386a: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:26:43.498568 containerd[1562]: time="2025-12-12T17:26:43.498420545Z" level=info msg="CreateContainer within sandbox \"023e9252f47b818535a262ef506960c0439b33ee0f07cd5d975662d6930b93ac\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"a116105e6a3e8689f27f15b3638edd513010ad2d3e69fff60019cf9fd86e386a\"" Dec 12 17:26:43.500415 containerd[1562]: time="2025-12-12T17:26:43.500293474Z" level=info msg="StartContainer for \"a116105e6a3e8689f27f15b3638edd513010ad2d3e69fff60019cf9fd86e386a\"" Dec 12 17:26:43.502924 containerd[1562]: time="2025-12-12T17:26:43.501775904Z" level=info msg="connecting to shim a116105e6a3e8689f27f15b3638edd513010ad2d3e69fff60019cf9fd86e386a" address="unix:///run/containerd/s/e1b02050aaac68f76c3b6d3ed775442e680606b77714447a6724b65b4b2738e2" protocol=ttrpc version=3 Dec 12 17:26:43.524097 systemd[1]: Started cri-containerd-a116105e6a3e8689f27f15b3638edd513010ad2d3e69fff60019cf9fd86e386a.scope - libcontainer container a116105e6a3e8689f27f15b3638edd513010ad2d3e69fff60019cf9fd86e386a. 
Dec 12 17:26:43.563935 containerd[1562]: time="2025-12-12T17:26:43.563898218Z" level=info msg="StartContainer for \"a116105e6a3e8689f27f15b3638edd513010ad2d3e69fff60019cf9fd86e386a\" returns successfully" Dec 12 17:26:44.105912 containerd[1562]: time="2025-12-12T17:26:44.105568860Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-mf55c,Uid:b2a6e3f8-60c3-41cd-b9ef-8e00c63a997f,Namespace:calico-system,Attempt:0,}" Dec 12 17:26:44.106737 containerd[1562]: time="2025-12-12T17:26:44.106643910Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-qhtq2,Uid:6ebd6282-432a-4c60-b9bf-2fc460bd9666,Namespace:kube-system,Attempt:0,}" Dec 12 17:26:44.305063 systemd-networkd[1434]: cali5aab5a5bdfa: Link UP Dec 12 17:26:44.305971 systemd-networkd[1434]: cali5aab5a5bdfa: Gained carrier Dec 12 17:26:44.331530 containerd[1562]: 2025-12-12 17:26:44.182 [INFO][4660] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--0--24adfa6772-k8s-coredns--674b8bbfcf--qhtq2-eth0 coredns-674b8bbfcf- kube-system 6ebd6282-432a-4c60-b9bf-2fc460bd9666 860 0 2025-12-12 17:25:54 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-2-0-24adfa6772 coredns-674b8bbfcf-qhtq2 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali5aab5a5bdfa [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="81bb3f722c649c07bf205201e795256f7862e30cff9bfda1eb429f6fc2435f85" Namespace="kube-system" Pod="coredns-674b8bbfcf-qhtq2" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-coredns--674b8bbfcf--qhtq2-" Dec 12 17:26:44.331530 containerd[1562]: 2025-12-12 17:26:44.184 [INFO][4660] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="81bb3f722c649c07bf205201e795256f7862e30cff9bfda1eb429f6fc2435f85" Namespace="kube-system" Pod="coredns-674b8bbfcf-qhtq2" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-coredns--674b8bbfcf--qhtq2-eth0" Dec 12 17:26:44.331530 containerd[1562]: 2025-12-12 17:26:44.238 [INFO][4681] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="81bb3f722c649c07bf205201e795256f7862e30cff9bfda1eb429f6fc2435f85" HandleID="k8s-pod-network.81bb3f722c649c07bf205201e795256f7862e30cff9bfda1eb429f6fc2435f85" Workload="ci--4459--2--2--0--24adfa6772-k8s-coredns--674b8bbfcf--qhtq2-eth0" Dec 12 17:26:44.331530 containerd[1562]: 2025-12-12 17:26:44.239 [INFO][4681] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="81bb3f722c649c07bf205201e795256f7862e30cff9bfda1eb429f6fc2435f85" HandleID="k8s-pod-network.81bb3f722c649c07bf205201e795256f7862e30cff9bfda1eb429f6fc2435f85" Workload="ci--4459--2--2--0--24adfa6772-k8s-coredns--674b8bbfcf--qhtq2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002cb050), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-2-0-24adfa6772", "pod":"coredns-674b8bbfcf-qhtq2", "timestamp":"2025-12-12 17:26:44.23872505 +0000 UTC"}, Hostname:"ci-4459-2-2-0-24adfa6772", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:26:44.331530 containerd[1562]: 2025-12-12 17:26:44.239 [INFO][4681] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:26:44.331530 containerd[1562]: 2025-12-12 17:26:44.239 [INFO][4681] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 17:26:44.331530 containerd[1562]: 2025-12-12 17:26:44.239 [INFO][4681] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-0-24adfa6772' Dec 12 17:26:44.331530 containerd[1562]: 2025-12-12 17:26:44.256 [INFO][4681] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.81bb3f722c649c07bf205201e795256f7862e30cff9bfda1eb429f6fc2435f85" host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:44.331530 containerd[1562]: 2025-12-12 17:26:44.264 [INFO][4681] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:44.331530 containerd[1562]: 2025-12-12 17:26:44.271 [INFO][4681] ipam/ipam.go 511: Trying affinity for 192.168.114.192/26 host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:44.331530 containerd[1562]: 2025-12-12 17:26:44.273 [INFO][4681] ipam/ipam.go 158: Attempting to load block cidr=192.168.114.192/26 host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:44.331530 containerd[1562]: 2025-12-12 17:26:44.276 [INFO][4681] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.114.192/26 host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:44.331530 containerd[1562]: 2025-12-12 17:26:44.277 [INFO][4681] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.114.192/26 handle="k8s-pod-network.81bb3f722c649c07bf205201e795256f7862e30cff9bfda1eb429f6fc2435f85" host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:44.331530 containerd[1562]: 2025-12-12 17:26:44.279 [INFO][4681] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.81bb3f722c649c07bf205201e795256f7862e30cff9bfda1eb429f6fc2435f85 Dec 12 17:26:44.331530 containerd[1562]: 2025-12-12 17:26:44.284 [INFO][4681] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.114.192/26 handle="k8s-pod-network.81bb3f722c649c07bf205201e795256f7862e30cff9bfda1eb429f6fc2435f85" host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:44.331530 containerd[1562]: 2025-12-12 17:26:44.294 [INFO][4681] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.114.199/26] block=192.168.114.192/26 handle="k8s-pod-network.81bb3f722c649c07bf205201e795256f7862e30cff9bfda1eb429f6fc2435f85" host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:44.331530 containerd[1562]: 2025-12-12 17:26:44.295 [INFO][4681] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.114.199/26] handle="k8s-pod-network.81bb3f722c649c07bf205201e795256f7862e30cff9bfda1eb429f6fc2435f85" host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:44.331530 containerd[1562]: 2025-12-12 17:26:44.295 [INFO][4681] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 17:26:44.331530 containerd[1562]: 2025-12-12 17:26:44.295 [INFO][4681] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.114.199/26] IPv6=[] ContainerID="81bb3f722c649c07bf205201e795256f7862e30cff9bfda1eb429f6fc2435f85" HandleID="k8s-pod-network.81bb3f722c649c07bf205201e795256f7862e30cff9bfda1eb429f6fc2435f85" Workload="ci--4459--2--2--0--24adfa6772-k8s-coredns--674b8bbfcf--qhtq2-eth0" Dec 12 17:26:44.333136 containerd[1562]: 2025-12-12 17:26:44.300 [INFO][4660] cni-plugin/k8s.go 418: Populated endpoint ContainerID="81bb3f722c649c07bf205201e795256f7862e30cff9bfda1eb429f6fc2435f85" Namespace="kube-system" Pod="coredns-674b8bbfcf-qhtq2" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-coredns--674b8bbfcf--qhtq2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--0--24adfa6772-k8s-coredns--674b8bbfcf--qhtq2-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"6ebd6282-432a-4c60-b9bf-2fc460bd9666", ResourceVersion:"860", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 25, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-0-24adfa6772", ContainerID:"", Pod:"coredns-674b8bbfcf-qhtq2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.114.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali5aab5a5bdfa", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:26:44.333136 containerd[1562]: 2025-12-12 17:26:44.300 [INFO][4660] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.114.199/32] ContainerID="81bb3f722c649c07bf205201e795256f7862e30cff9bfda1eb429f6fc2435f85" Namespace="kube-system" Pod="coredns-674b8bbfcf-qhtq2" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-coredns--674b8bbfcf--qhtq2-eth0" Dec 12 17:26:44.333136 containerd[1562]: 2025-12-12 17:26:44.300 [INFO][4660] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5aab5a5bdfa ContainerID="81bb3f722c649c07bf205201e795256f7862e30cff9bfda1eb429f6fc2435f85" Namespace="kube-system" Pod="coredns-674b8bbfcf-qhtq2" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-coredns--674b8bbfcf--qhtq2-eth0" Dec 12 17:26:44.333136 containerd[1562]: 2025-12-12 17:26:44.302 [INFO][4660] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="81bb3f722c649c07bf205201e795256f7862e30cff9bfda1eb429f6fc2435f85" Namespace="kube-system" Pod="coredns-674b8bbfcf-qhtq2" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-coredns--674b8bbfcf--qhtq2-eth0" Dec 12 17:26:44.333136 containerd[1562]: 2025-12-12 17:26:44.302 [INFO][4660] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="81bb3f722c649c07bf205201e795256f7862e30cff9bfda1eb429f6fc2435f85" Namespace="kube-system" Pod="coredns-674b8bbfcf-qhtq2" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-coredns--674b8bbfcf--qhtq2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--0--24adfa6772-k8s-coredns--674b8bbfcf--qhtq2-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"6ebd6282-432a-4c60-b9bf-2fc460bd9666", ResourceVersion:"860", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 25, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-0-24adfa6772", ContainerID:"81bb3f722c649c07bf205201e795256f7862e30cff9bfda1eb429f6fc2435f85", Pod:"coredns-674b8bbfcf-qhtq2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.114.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali5aab5a5bdfa", 
MAC:"7e:ec:71:e5:54:ee", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:26:44.333136 containerd[1562]: 2025-12-12 17:26:44.328 [INFO][4660] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="81bb3f722c649c07bf205201e795256f7862e30cff9bfda1eb429f6fc2435f85" Namespace="kube-system" Pod="coredns-674b8bbfcf-qhtq2" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-coredns--674b8bbfcf--qhtq2-eth0" Dec 12 17:26:44.362113 containerd[1562]: time="2025-12-12T17:26:44.361919737Z" level=info msg="connecting to shim 81bb3f722c649c07bf205201e795256f7862e30cff9bfda1eb429f6fc2435f85" address="unix:///run/containerd/s/eaff1a7d516300a5f101f8098b6b7d4e0dace7a6ed52e86df45f373d19ff37bb" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:26:44.404221 kubelet[2788]: E1212 17:26:44.404141 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-qj8f4" podUID="b3554d30-e274-4b18-8389-696d2cc03c37" Dec 12 17:26:44.418078 systemd[1]: Started cri-containerd-81bb3f722c649c07bf205201e795256f7862e30cff9bfda1eb429f6fc2435f85.scope - libcontainer container 81bb3f722c649c07bf205201e795256f7862e30cff9bfda1eb429f6fc2435f85. Dec 12 17:26:44.448279 systemd-networkd[1434]: calib6ee4a87981: Link UP Dec 12 17:26:44.449980 systemd-networkd[1434]: calib6ee4a87981: Gained carrier Dec 12 17:26:44.476509 kubelet[2788]: I1212 17:26:44.476435 2788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-vfbql" podStartSLOduration=50.47641542 podStartE2EDuration="50.47641542s" podCreationTimestamp="2025-12-12 17:25:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 17:26:44.474644418 +0000 UTC m=+55.506738223" watchObservedRunningTime="2025-12-12 17:26:44.47641542 +0000 UTC m=+55.508509225" Dec 12 17:26:44.488096 containerd[1562]: 2025-12-12 17:26:44.174 [INFO][4648] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--0--24adfa6772-k8s-goldmane--666569f655--mf55c-eth0 goldmane-666569f655- calico-system b2a6e3f8-60c3-41cd-b9ef-8e00c63a997f 871 0 2025-12-12 17:26:10 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:666569f655 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4459-2-2-0-24adfa6772 goldmane-666569f655-mf55c eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calib6ee4a87981 [] [] }} 
ContainerID="c3051817bd7487a137daab3f00416f2ce9d8e7c19cf2b1d82a45ebd74e19fe5a" Namespace="calico-system" Pod="goldmane-666569f655-mf55c" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-goldmane--666569f655--mf55c-" Dec 12 17:26:44.488096 containerd[1562]: 2025-12-12 17:26:44.174 [INFO][4648] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c3051817bd7487a137daab3f00416f2ce9d8e7c19cf2b1d82a45ebd74e19fe5a" Namespace="calico-system" Pod="goldmane-666569f655-mf55c" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-goldmane--666569f655--mf55c-eth0" Dec 12 17:26:44.488096 containerd[1562]: 2025-12-12 17:26:44.244 [INFO][4676] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c3051817bd7487a137daab3f00416f2ce9d8e7c19cf2b1d82a45ebd74e19fe5a" HandleID="k8s-pod-network.c3051817bd7487a137daab3f00416f2ce9d8e7c19cf2b1d82a45ebd74e19fe5a" Workload="ci--4459--2--2--0--24adfa6772-k8s-goldmane--666569f655--mf55c-eth0" Dec 12 17:26:44.488096 containerd[1562]: 2025-12-12 17:26:44.244 [INFO][4676] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="c3051817bd7487a137daab3f00416f2ce9d8e7c19cf2b1d82a45ebd74e19fe5a" HandleID="k8s-pod-network.c3051817bd7487a137daab3f00416f2ce9d8e7c19cf2b1d82a45ebd74e19fe5a" Workload="ci--4459--2--2--0--24adfa6772-k8s-goldmane--666569f655--mf55c-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000120c10), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-0-24adfa6772", "pod":"goldmane-666569f655-mf55c", "timestamp":"2025-12-12 17:26:44.244119021 +0000 UTC"}, Hostname:"ci-4459-2-2-0-24adfa6772", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:26:44.488096 containerd[1562]: 2025-12-12 17:26:44.244 [INFO][4676] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. 
Dec 12 17:26:44.488096 containerd[1562]: 2025-12-12 17:26:44.295 [INFO][4676] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 12 17:26:44.488096 containerd[1562]: 2025-12-12 17:26:44.295 [INFO][4676] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-0-24adfa6772' Dec 12 17:26:44.488096 containerd[1562]: 2025-12-12 17:26:44.357 [INFO][4676] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c3051817bd7487a137daab3f00416f2ce9d8e7c19cf2b1d82a45ebd74e19fe5a" host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:44.488096 containerd[1562]: 2025-12-12 17:26:44.368 [INFO][4676] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:44.488096 containerd[1562]: 2025-12-12 17:26:44.376 [INFO][4676] ipam/ipam.go 511: Trying affinity for 192.168.114.192/26 host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:44.488096 containerd[1562]: 2025-12-12 17:26:44.383 [INFO][4676] ipam/ipam.go 158: Attempting to load block cidr=192.168.114.192/26 host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:44.488096 containerd[1562]: 2025-12-12 17:26:44.389 [INFO][4676] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.114.192/26 host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:44.488096 containerd[1562]: 2025-12-12 17:26:44.389 [INFO][4676] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.114.192/26 handle="k8s-pod-network.c3051817bd7487a137daab3f00416f2ce9d8e7c19cf2b1d82a45ebd74e19fe5a" host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:44.488096 containerd[1562]: 2025-12-12 17:26:44.393 [INFO][4676] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.c3051817bd7487a137daab3f00416f2ce9d8e7c19cf2b1d82a45ebd74e19fe5a Dec 12 17:26:44.488096 containerd[1562]: 2025-12-12 17:26:44.409 [INFO][4676] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.114.192/26 
handle="k8s-pod-network.c3051817bd7487a137daab3f00416f2ce9d8e7c19cf2b1d82a45ebd74e19fe5a" host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:44.488096 containerd[1562]: 2025-12-12 17:26:44.432 [INFO][4676] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.114.200/26] block=192.168.114.192/26 handle="k8s-pod-network.c3051817bd7487a137daab3f00416f2ce9d8e7c19cf2b1d82a45ebd74e19fe5a" host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:44.488096 containerd[1562]: 2025-12-12 17:26:44.432 [INFO][4676] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.114.200/26] handle="k8s-pod-network.c3051817bd7487a137daab3f00416f2ce9d8e7c19cf2b1d82a45ebd74e19fe5a" host="ci-4459-2-2-0-24adfa6772" Dec 12 17:26:44.488096 containerd[1562]: 2025-12-12 17:26:44.432 [INFO][4676] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 17:26:44.488096 containerd[1562]: 2025-12-12 17:26:44.432 [INFO][4676] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.114.200/26] IPv6=[] ContainerID="c3051817bd7487a137daab3f00416f2ce9d8e7c19cf2b1d82a45ebd74e19fe5a" HandleID="k8s-pod-network.c3051817bd7487a137daab3f00416f2ce9d8e7c19cf2b1d82a45ebd74e19fe5a" Workload="ci--4459--2--2--0--24adfa6772-k8s-goldmane--666569f655--mf55c-eth0" Dec 12 17:26:44.491695 containerd[1562]: 2025-12-12 17:26:44.441 [INFO][4648] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c3051817bd7487a137daab3f00416f2ce9d8e7c19cf2b1d82a45ebd74e19fe5a" Namespace="calico-system" Pod="goldmane-666569f655-mf55c" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-goldmane--666569f655--mf55c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--0--24adfa6772-k8s-goldmane--666569f655--mf55c-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"b2a6e3f8-60c3-41cd-b9ef-8e00c63a997f", ResourceVersion:"871", Generation:0, CreationTimestamp:time.Date(2025, time.December, 
12, 17, 26, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-0-24adfa6772", ContainerID:"", Pod:"goldmane-666569f655-mf55c", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.114.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calib6ee4a87981", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:26:44.491695 containerd[1562]: 2025-12-12 17:26:44.441 [INFO][4648] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.114.200/32] ContainerID="c3051817bd7487a137daab3f00416f2ce9d8e7c19cf2b1d82a45ebd74e19fe5a" Namespace="calico-system" Pod="goldmane-666569f655-mf55c" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-goldmane--666569f655--mf55c-eth0" Dec 12 17:26:44.491695 containerd[1562]: 2025-12-12 17:26:44.441 [INFO][4648] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib6ee4a87981 ContainerID="c3051817bd7487a137daab3f00416f2ce9d8e7c19cf2b1d82a45ebd74e19fe5a" Namespace="calico-system" Pod="goldmane-666569f655-mf55c" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-goldmane--666569f655--mf55c-eth0" Dec 12 17:26:44.491695 containerd[1562]: 2025-12-12 17:26:44.451 [INFO][4648] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c3051817bd7487a137daab3f00416f2ce9d8e7c19cf2b1d82a45ebd74e19fe5a" 
Namespace="calico-system" Pod="goldmane-666569f655-mf55c" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-goldmane--666569f655--mf55c-eth0" Dec 12 17:26:44.491695 containerd[1562]: 2025-12-12 17:26:44.452 [INFO][4648] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c3051817bd7487a137daab3f00416f2ce9d8e7c19cf2b1d82a45ebd74e19fe5a" Namespace="calico-system" Pod="goldmane-666569f655-mf55c" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-goldmane--666569f655--mf55c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--0--24adfa6772-k8s-goldmane--666569f655--mf55c-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"b2a6e3f8-60c3-41cd-b9ef-8e00c63a997f", ResourceVersion:"871", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 26, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-0-24adfa6772", ContainerID:"c3051817bd7487a137daab3f00416f2ce9d8e7c19cf2b1d82a45ebd74e19fe5a", Pod:"goldmane-666569f655-mf55c", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.114.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calib6ee4a87981", MAC:"ea:aa:41:18:5a:b1", Ports:[]v3.WorkloadEndpointPort(nil), 
AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:26:44.491695 containerd[1562]: 2025-12-12 17:26:44.479 [INFO][4648] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c3051817bd7487a137daab3f00416f2ce9d8e7c19cf2b1d82a45ebd74e19fe5a" Namespace="calico-system" Pod="goldmane-666569f655-mf55c" WorkloadEndpoint="ci--4459--2--2--0--24adfa6772-k8s-goldmane--666569f655--mf55c-eth0" Dec 12 17:26:44.529549 containerd[1562]: time="2025-12-12T17:26:44.529073828Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-qhtq2,Uid:6ebd6282-432a-4c60-b9bf-2fc460bd9666,Namespace:kube-system,Attempt:0,} returns sandbox id \"81bb3f722c649c07bf205201e795256f7862e30cff9bfda1eb429f6fc2435f85\"" Dec 12 17:26:44.549286 containerd[1562]: time="2025-12-12T17:26:44.549151802Z" level=info msg="CreateContainer within sandbox \"81bb3f722c649c07bf205201e795256f7862e30cff9bfda1eb429f6fc2435f85\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 12 17:26:44.564897 containerd[1562]: time="2025-12-12T17:26:44.564815730Z" level=info msg="connecting to shim c3051817bd7487a137daab3f00416f2ce9d8e7c19cf2b1d82a45ebd74e19fe5a" address="unix:///run/containerd/s/1ab61962be0d0462d1f23e5cc6abf65dd48840016d5f5e76548f2f404569b66f" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:26:44.574816 containerd[1562]: time="2025-12-12T17:26:44.574075360Z" level=info msg="Container 360bfaa6a9c9b3172bd105c4875d88f4869ce2406b6bc05d9297682ff4df41de: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:26:44.589449 containerd[1562]: time="2025-12-12T17:26:44.589394753Z" level=info msg="CreateContainer within sandbox \"81bb3f722c649c07bf205201e795256f7862e30cff9bfda1eb429f6fc2435f85\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"360bfaa6a9c9b3172bd105c4875d88f4869ce2406b6bc05d9297682ff4df41de\"" Dec 12 17:26:44.595308 containerd[1562]: time="2025-12-12T17:26:44.595269426Z" level=info 
msg="StartContainer for \"360bfaa6a9c9b3172bd105c4875d88f4869ce2406b6bc05d9297682ff4df41de\"" Dec 12 17:26:44.598033 containerd[1562]: time="2025-12-12T17:26:44.597128392Z" level=info msg="connecting to shim 360bfaa6a9c9b3172bd105c4875d88f4869ce2406b6bc05d9297682ff4df41de" address="unix:///run/containerd/s/eaff1a7d516300a5f101f8098b6b7d4e0dace7a6ed52e86df45f373d19ff37bb" protocol=ttrpc version=3 Dec 12 17:26:44.603283 systemd[1]: Started cri-containerd-c3051817bd7487a137daab3f00416f2ce9d8e7c19cf2b1d82a45ebd74e19fe5a.scope - libcontainer container c3051817bd7487a137daab3f00416f2ce9d8e7c19cf2b1d82a45ebd74e19fe5a. Dec 12 17:26:44.624215 systemd[1]: Started cri-containerd-360bfaa6a9c9b3172bd105c4875d88f4869ce2406b6bc05d9297682ff4df41de.scope - libcontainer container 360bfaa6a9c9b3172bd105c4875d88f4869ce2406b6bc05d9297682ff4df41de. Dec 12 17:26:44.683591 containerd[1562]: time="2025-12-12T17:26:44.683548130Z" level=info msg="StartContainer for \"360bfaa6a9c9b3172bd105c4875d88f4869ce2406b6bc05d9297682ff4df41de\" returns successfully" Dec 12 17:26:44.703086 containerd[1562]: time="2025-12-12T17:26:44.703011995Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-mf55c,Uid:b2a6e3f8-60c3-41cd-b9ef-8e00c63a997f,Namespace:calico-system,Attempt:0,} returns sandbox id \"c3051817bd7487a137daab3f00416f2ce9d8e7c19cf2b1d82a45ebd74e19fe5a\"" Dec 12 17:26:44.708448 containerd[1562]: time="2025-12-12T17:26:44.708322801Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 12 17:26:45.054316 containerd[1562]: time="2025-12-12T17:26:45.054258791Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:26:45.056810 containerd[1562]: time="2025-12-12T17:26:45.056747544Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 12 17:26:45.057577 containerd[1562]: time="2025-12-12T17:26:45.056844868Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 12 17:26:45.057654 kubelet[2788]: E1212 17:26:45.057120 2788 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:26:45.057654 kubelet[2788]: E1212 17:26:45.057175 2788 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:26:45.057654 kubelet[2788]: E1212 17:26:45.057318 2788 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6n2ms,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-mf55c_calico-system(b2a6e3f8-60c3-41cd-b9ef-8e00c63a997f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 12 17:26:45.059080 kubelet[2788]: E1212 17:26:45.058987 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-mf55c" podUID="b2a6e3f8-60c3-41cd-b9ef-8e00c63a997f" Dec 12 17:26:45.228242 systemd-networkd[1434]: calice628b17ca9: Gained IPv6LL Dec 12 17:26:45.401921 kubelet[2788]: E1212 17:26:45.401682 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: 
\"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-mf55c" podUID="b2a6e3f8-60c3-41cd-b9ef-8e00c63a997f" Dec 12 17:26:45.435503 kubelet[2788]: I1212 17:26:45.435435 2788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-qhtq2" podStartSLOduration=51.435416162 podStartE2EDuration="51.435416162s" podCreationTimestamp="2025-12-12 17:25:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 17:26:45.434588565 +0000 UTC m=+56.466682370" watchObservedRunningTime="2025-12-12 17:26:45.435416162 +0000 UTC m=+56.467510287" Dec 12 17:26:45.484368 systemd-networkd[1434]: cali5aab5a5bdfa: Gained IPv6LL Dec 12 17:26:46.188523 systemd-networkd[1434]: calib6ee4a87981: Gained IPv6LL Dec 12 17:26:46.410200 kubelet[2788]: E1212 17:26:46.409904 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-mf55c" podUID="b2a6e3f8-60c3-41cd-b9ef-8e00c63a997f" Dec 12 17:26:49.109367 containerd[1562]: time="2025-12-12T17:26:49.109050999Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 12 17:26:49.451643 containerd[1562]: time="2025-12-12T17:26:49.451493760Z" level=info 
msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:26:49.452915 containerd[1562]: time="2025-12-12T17:26:49.452846417Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 12 17:26:49.453040 containerd[1562]: time="2025-12-12T17:26:49.452975262Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 12 17:26:49.453253 kubelet[2788]: E1212 17:26:49.453214 2788 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:26:49.455104 kubelet[2788]: E1212 17:26:49.453308 2788 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:26:49.455104 kubelet[2788]: E1212 17:26:49.453454 2788 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:c902c75b8da444adacbeeefd6f418d52,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7rbr4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5cf75855f7-k7v95_calico-system(43e43bdb-6d56-4012-94dd-d2daf57ba1c9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 12 17:26:49.456602 containerd[1562]: time="2025-12-12T17:26:49.456565492Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 12 
17:26:49.786260 containerd[1562]: time="2025-12-12T17:26:49.786113676Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:26:49.788510 containerd[1562]: time="2025-12-12T17:26:49.788388610Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 12 17:26:49.788510 containerd[1562]: time="2025-12-12T17:26:49.788441013Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 12 17:26:49.789004 kubelet[2788]: E1212 17:26:49.788908 2788 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:26:49.789004 kubelet[2788]: E1212 17:26:49.788991 2788 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:26:49.789371 kubelet[2788]: E1212 17:26:49.789175 2788 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7rbr4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5cf75855f7-k7v95_calico-system(43e43bdb-6d56-4012-94dd-d2daf57ba1c9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 12 17:26:49.790845 kubelet[2788]: E1212 17:26:49.790615 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cf75855f7-k7v95" podUID="43e43bdb-6d56-4012-94dd-d2daf57ba1c9" Dec 12 17:26:55.106187 containerd[1562]: time="2025-12-12T17:26:55.106124773Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 12 17:26:55.467169 containerd[1562]: time="2025-12-12T17:26:55.467008883Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:26:55.468887 containerd[1562]: time="2025-12-12T17:26:55.468728786Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 12 17:26:55.468887 containerd[1562]: time="2025-12-12T17:26:55.468840031Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 12 17:26:55.469133 kubelet[2788]: E1212 17:26:55.469043 
2788 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:26:55.469133 kubelet[2788]: E1212 17:26:55.469124 2788 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:26:55.469649 kubelet[2788]: E1212 17:26:55.469323 2788 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6lrv7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:
nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-qj8f4_calico-system(b3554d30-e274-4b18-8389-696d2cc03c37): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 12 17:26:55.473318 containerd[1562]: time="2025-12-12T17:26:55.473234033Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 12 17:26:55.827350 containerd[1562]: time="2025-12-12T17:26:55.827269250Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:26:55.828892 containerd[1562]: time="2025-12-12T17:26:55.828787146Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 12 17:26:55.829013 containerd[1562]: time="2025-12-12T17:26:55.828949312Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 12 17:26:55.829574 
kubelet[2788]: E1212 17:26:55.829454 2788 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:26:55.829893 kubelet[2788]: E1212 17:26:55.829745 2788 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:26:55.830476 kubelet[2788]: E1212 17:26:55.830345 2788 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6lrv7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-qj8f4_calico-system(b3554d30-e274-4b18-8389-696d2cc03c37): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 12 17:26:55.832181 kubelet[2788]: E1212 17:26:55.831958 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-qj8f4" podUID="b3554d30-e274-4b18-8389-696d2cc03c37" Dec 12 17:26:56.107024 containerd[1562]: time="2025-12-12T17:26:56.106356307Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 12 17:26:56.442807 containerd[1562]: time="2025-12-12T17:26:56.442041167Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:26:56.444566 containerd[1562]: time="2025-12-12T17:26:56.444075280Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 12 17:26:56.444566 containerd[1562]: time="2025-12-12T17:26:56.444179164Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 12 17:26:56.445437 
kubelet[2788]: E1212 17:26:56.444393 2788 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:26:56.445437 kubelet[2788]: E1212 17:26:56.444437 2788 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:26:56.445437 kubelet[2788]: E1212 17:26:56.444705 2788 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bu
ndle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xwlm8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-8f9f7d5ff-pj77s_calico-system(c7baeefe-1ae7-4ac7-a668-035dfb7baaef): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 12 17:26:56.445687 containerd[1562]: time="2025-12-12T17:26:56.444934271Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 
17:26:56.446237 kubelet[2788]: E1212 17:26:56.446190 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8f9f7d5ff-pj77s" podUID="c7baeefe-1ae7-4ac7-a668-035dfb7baaef" Dec 12 17:26:56.778583 containerd[1562]: time="2025-12-12T17:26:56.778172162Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:26:56.781279 containerd[1562]: time="2025-12-12T17:26:56.780333960Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:26:56.781279 containerd[1562]: time="2025-12-12T17:26:56.780463845Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:26:56.781500 kubelet[2788]: E1212 17:26:56.780977 2788 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:26:56.781500 kubelet[2788]: E1212 17:26:56.781026 2788 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:26:56.781500 kubelet[2788]: E1212 17:26:56.781173 2788 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gdt9m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-74c4865b69-rq46q_calico-apiserver(96b8454b-22d4-4695-8613-4bfabbdf8fc4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:26:56.782436 kubelet[2788]: E1212 17:26:56.782350 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-74c4865b69-rq46q" podUID="96b8454b-22d4-4695-8613-4bfabbdf8fc4" Dec 12 17:26:57.111947 containerd[1562]: time="2025-12-12T17:26:57.110538626Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:26:57.673329 containerd[1562]: 
time="2025-12-12T17:26:57.673167181Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:26:57.674795 containerd[1562]: time="2025-12-12T17:26:57.674716716Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:26:57.674997 containerd[1562]: time="2025-12-12T17:26:57.674853801Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:26:57.675317 kubelet[2788]: E1212 17:26:57.675184 2788 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:26:57.675317 kubelet[2788]: E1212 17:26:57.675263 2788 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:26:57.675646 kubelet[2788]: E1212 17:26:57.675467 2788 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l79zc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-74c4865b69-4lwrd_calico-apiserver(2598be35-bdd7-4b8f-994d-8273d0db5ae9): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:26:57.677713 kubelet[2788]: E1212 17:26:57.677653 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-74c4865b69-4lwrd" podUID="2598be35-bdd7-4b8f-994d-8273d0db5ae9" Dec 12 17:26:59.109057 containerd[1562]: time="2025-12-12T17:26:59.108825403Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 12 17:26:59.447566 containerd[1562]: time="2025-12-12T17:26:59.447348480Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:26:59.449380 containerd[1562]: time="2025-12-12T17:26:59.449292026Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 12 17:26:59.449572 containerd[1562]: time="2025-12-12T17:26:59.449431831Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 12 17:26:59.449997 kubelet[2788]: E1212 17:26:59.449850 2788 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": 
ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:26:59.451906 kubelet[2788]: E1212 17:26:59.451467 2788 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:26:59.452066 kubelet[2788]: E1212 17:26:59.451768 2788 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6n2ms,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,Su
bPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-mf55c_calico-system(b2a6e3f8-60c3-41cd-b9ef-8e00c63a997f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 12 17:26:59.453771 kubelet[2788]: E1212 17:26:59.453710 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": 
ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-mf55c" podUID="b2a6e3f8-60c3-41cd-b9ef-8e00c63a997f" Dec 12 17:27:01.109667 kubelet[2788]: E1212 17:27:01.109607 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cf75855f7-k7v95" podUID="43e43bdb-6d56-4012-94dd-d2daf57ba1c9" Dec 12 17:27:08.107021 kubelet[2788]: E1212 17:27:08.106739 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-74c4865b69-rq46q" podUID="96b8454b-22d4-4695-8613-4bfabbdf8fc4" Dec 12 17:27:10.108136 kubelet[2788]: E1212 17:27:10.108082 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-74c4865b69-4lwrd" podUID="2598be35-bdd7-4b8f-994d-8273d0db5ae9" Dec 12 17:27:10.110123 kubelet[2788]: E1212 17:27:10.110023 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-qj8f4" podUID="b3554d30-e274-4b18-8389-696d2cc03c37" Dec 12 17:27:12.108071 kubelet[2788]: E1212 17:27:12.107662 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve 
reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8f9f7d5ff-pj77s" podUID="c7baeefe-1ae7-4ac7-a668-035dfb7baaef" Dec 12 17:27:12.108071 kubelet[2788]: E1212 17:27:12.107943 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-mf55c" podUID="b2a6e3f8-60c3-41cd-b9ef-8e00c63a997f" Dec 12 17:27:15.109969 containerd[1562]: time="2025-12-12T17:27:15.109434740Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 12 17:27:15.435374 containerd[1562]: time="2025-12-12T17:27:15.434962949Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:27:15.436461 containerd[1562]: time="2025-12-12T17:27:15.436274103Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 12 17:27:15.436461 containerd[1562]: time="2025-12-12T17:27:15.436377866Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 12 17:27:15.436910 kubelet[2788]: E1212 17:27:15.436845 2788 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:27:15.437308 kubelet[2788]: E1212 17:27:15.436922 2788 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:27:15.437889 kubelet[2788]: E1212 17:27:15.437672 2788 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:c902c75b8da444adacbeeefd6f418d52,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7rbr4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessa
gePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5cf75855f7-k7v95_calico-system(43e43bdb-6d56-4012-94dd-d2daf57ba1c9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 12 17:27:15.441069 containerd[1562]: time="2025-12-12T17:27:15.441033547Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 12 17:27:15.778020 containerd[1562]: time="2025-12-12T17:27:15.777967093Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:27:15.779342 containerd[1562]: time="2025-12-12T17:27:15.779246087Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 12 17:27:15.779342 containerd[1562]: time="2025-12-12T17:27:15.779271527Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 12 17:27:15.779714 kubelet[2788]: E1212 17:27:15.779661 2788 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:27:15.779714 kubelet[2788]: E1212 17:27:15.779715 2788 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound 
desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:27:15.779913 kubelet[2788]: E1212 17:27:15.779830 2788 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7rbr4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorP
rofile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5cf75855f7-k7v95_calico-system(43e43bdb-6d56-4012-94dd-d2daf57ba1c9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 12 17:27:15.782118 kubelet[2788]: E1212 17:27:15.782061 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cf75855f7-k7v95" podUID="43e43bdb-6d56-4012-94dd-d2daf57ba1c9" Dec 12 17:27:21.108075 containerd[1562]: time="2025-12-12T17:27:21.107735213Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:27:21.465424 containerd[1562]: time="2025-12-12T17:27:21.465053216Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:27:21.467118 containerd[1562]: time="2025-12-12T17:27:21.466955102Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack 
image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:27:21.467118 containerd[1562]: time="2025-12-12T17:27:21.467073344Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:27:21.467437 kubelet[2788]: E1212 17:27:21.467278 2788 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:27:21.467437 kubelet[2788]: E1212 17:27:21.467345 2788 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:27:21.467437 kubelet[2788]: E1212 17:27:21.467495 2788 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gdt9m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-74c4865b69-rq46q_calico-apiserver(96b8454b-22d4-4695-8613-4bfabbdf8fc4): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:27:21.468966 kubelet[2788]: E1212 17:27:21.468922 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-74c4865b69-rq46q" podUID="96b8454b-22d4-4695-8613-4bfabbdf8fc4" Dec 12 17:27:22.109284 containerd[1562]: time="2025-12-12T17:27:22.108539603Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 12 17:27:22.441963 containerd[1562]: time="2025-12-12T17:27:22.441110028Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:27:22.443257 containerd[1562]: time="2025-12-12T17:27:22.443186357Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 12 17:27:22.443682 containerd[1562]: time="2025-12-12T17:27:22.443354761Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 12 17:27:22.443937 kubelet[2788]: E1212 17:27:22.443792 2788 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:27:22.444109 kubelet[2788]: E1212 17:27:22.443850 2788 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:27:22.444580 kubelet[2788]: E1212 17:27:22.444524 2788 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6lrv7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivileg
eEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-qj8f4_calico-system(b3554d30-e274-4b18-8389-696d2cc03c37): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 12 17:27:22.448977 containerd[1562]: time="2025-12-12T17:27:22.448824410Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 12 17:27:22.799568 containerd[1562]: time="2025-12-12T17:27:22.799382861Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:27:22.801908 containerd[1562]: time="2025-12-12T17:27:22.800858336Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 12 17:27:22.802332 containerd[1562]: time="2025-12-12T17:27:22.800899857Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 12 17:27:22.802879 kubelet[2788]: E1212 17:27:22.802722 2788 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:27:22.802879 kubelet[2788]: E1212 17:27:22.802811 2788 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:27:22.803840 kubelet[2788]: E1212 17:27:22.803389 2788 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6lrv7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,Terminat
ionMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-qj8f4_calico-system(b3554d30-e274-4b18-8389-696d2cc03c37): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 12 17:27:22.804806 kubelet[2788]: E1212 17:27:22.804740 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-qj8f4" podUID="b3554d30-e274-4b18-8389-696d2cc03c37" Dec 12 17:27:23.109594 containerd[1562]: time="2025-12-12T17:27:23.109212595Z" level=info 
msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 12 17:27:23.450980 containerd[1562]: time="2025-12-12T17:27:23.450347158Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:27:23.453886 containerd[1562]: time="2025-12-12T17:27:23.451930235Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 12 17:27:23.453886 containerd[1562]: time="2025-12-12T17:27:23.451968796Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 12 17:27:23.454224 kubelet[2788]: E1212 17:27:23.454185 2788 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:27:23.454885 kubelet[2788]: E1212 17:27:23.454306 2788 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:27:23.454885 kubelet[2788]: E1212 17:27:23.454489 2788 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6n2ms,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-mf55c_calico-system(b2a6e3f8-60c3-41cd-b9ef-8e00c63a997f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 12 17:27:23.456219 kubelet[2788]: E1212 17:27:23.456118 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-mf55c" podUID="b2a6e3f8-60c3-41cd-b9ef-8e00c63a997f" Dec 12 17:27:25.108307 containerd[1562]: time="2025-12-12T17:27:25.108261104Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:27:25.454535 containerd[1562]: time="2025-12-12T17:27:25.454372342Z" level=info msg="fetch failed after status: 404 
Not Found" host=ghcr.io Dec 12 17:27:25.456556 containerd[1562]: time="2025-12-12T17:27:25.456447110Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:27:25.456556 containerd[1562]: time="2025-12-12T17:27:25.456563872Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:27:25.457082 kubelet[2788]: E1212 17:27:25.457032 2788 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:27:25.457652 kubelet[2788]: E1212 17:27:25.457090 2788 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:27:25.457652 kubelet[2788]: E1212 17:27:25.457222 2788 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l79zc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-74c4865b69-4lwrd_calico-apiserver(2598be35-bdd7-4b8f-994d-8273d0db5ae9): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:27:25.458498 kubelet[2788]: E1212 17:27:25.458457 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-74c4865b69-4lwrd" podUID="2598be35-bdd7-4b8f-994d-8273d0db5ae9" Dec 12 17:27:27.106799 containerd[1562]: time="2025-12-12T17:27:27.106743101Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 12 17:27:27.444054 containerd[1562]: time="2025-12-12T17:27:27.443895511Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:27:27.445908 containerd[1562]: time="2025-12-12T17:27:27.445819434Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 12 17:27:27.446040 containerd[1562]: time="2025-12-12T17:27:27.445692151Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 12 17:27:27.446493 kubelet[2788]: E1212 17:27:27.446397 2788 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:27:27.446493 kubelet[2788]: E1212 17:27:27.446478 2788 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:27:27.448252 kubelet[2788]: E1212 17:27:27.447059 2788 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xwlm8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},
},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-8f9f7d5ff-pj77s_calico-system(c7baeefe-1ae7-4ac7-a668-035dfb7baaef): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 12 17:27:27.449329 kubelet[2788]: E1212 17:27:27.449267 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8f9f7d5ff-pj77s" podUID="c7baeefe-1ae7-4ac7-a668-035dfb7baaef" Dec 12 17:27:30.108270 kubelet[2788]: E1212 17:27:30.108197 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cf75855f7-k7v95" podUID="43e43bdb-6d56-4012-94dd-d2daf57ba1c9" Dec 12 17:27:32.106107 kubelet[2788]: E1212 17:27:32.106051 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-74c4865b69-rq46q" podUID="96b8454b-22d4-4695-8613-4bfabbdf8fc4" Dec 12 17:27:34.107673 kubelet[2788]: E1212 17:27:34.107627 2788 pod_workers.go:1301] 
"Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-qj8f4" podUID="b3554d30-e274-4b18-8389-696d2cc03c37" Dec 12 17:27:34.109219 kubelet[2788]: E1212 17:27:34.109160 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-mf55c" podUID="b2a6e3f8-60c3-41cd-b9ef-8e00c63a997f" Dec 12 17:27:41.108051 kubelet[2788]: E1212 17:27:41.107613 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-74c4865b69-4lwrd" podUID="2598be35-bdd7-4b8f-994d-8273d0db5ae9" Dec 12 17:27:42.113757 kubelet[2788]: E1212 17:27:42.113442 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8f9f7d5ff-pj77s" podUID="c7baeefe-1ae7-4ac7-a668-035dfb7baaef" Dec 12 17:27:43.107317 kubelet[2788]: E1212 17:27:43.107191 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-74c4865b69-rq46q" podUID="96b8454b-22d4-4695-8613-4bfabbdf8fc4" Dec 12 17:27:45.109669 kubelet[2788]: E1212 17:27:45.109594 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to 
resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-qj8f4" podUID="b3554d30-e274-4b18-8389-696d2cc03c37" Dec 12 17:27:45.112119 kubelet[2788]: E1212 17:27:45.111444 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cf75855f7-k7v95" podUID="43e43bdb-6d56-4012-94dd-d2daf57ba1c9" Dec 12 17:27:48.108557 kubelet[2788]: E1212 17:27:48.108276 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: 
rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-mf55c" podUID="b2a6e3f8-60c3-41cd-b9ef-8e00c63a997f" Dec 12 17:27:53.108039 kubelet[2788]: E1212 17:27:53.107689 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8f9f7d5ff-pj77s" podUID="c7baeefe-1ae7-4ac7-a668-035dfb7baaef" Dec 12 17:27:55.107830 kubelet[2788]: E1212 17:27:55.107756 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-74c4865b69-rq46q" podUID="96b8454b-22d4-4695-8613-4bfabbdf8fc4" Dec 12 17:27:56.107536 kubelet[2788]: E1212 17:27:56.107482 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and 
unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-74c4865b69-4lwrd" podUID="2598be35-bdd7-4b8f-994d-8273d0db5ae9" Dec 12 17:27:56.109791 containerd[1562]: time="2025-12-12T17:27:56.109681161Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 12 17:27:56.434725 containerd[1562]: time="2025-12-12T17:27:56.434154585Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:27:56.437024 containerd[1562]: time="2025-12-12T17:27:56.436938473Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 12 17:27:56.437999 containerd[1562]: time="2025-12-12T17:27:56.436971393Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 12 17:27:56.438244 kubelet[2788]: E1212 17:27:56.438184 2788 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:27:56.438750 kubelet[2788]: E1212 17:27:56.438250 2788 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 
17:27:56.438750 kubelet[2788]: E1212 17:27:56.438382 2788 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:c902c75b8da444adacbeeefd6f418d52,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7rbr4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5cf75855f7-k7v95_calico-system(43e43bdb-6d56-4012-94dd-d2daf57ba1c9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 12 17:27:56.441153 containerd[1562]: 
time="2025-12-12T17:27:56.441070824Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 12 17:27:56.797471 containerd[1562]: time="2025-12-12T17:27:56.797284474Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:27:56.798527 containerd[1562]: time="2025-12-12T17:27:56.798426854Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 12 17:27:56.798527 containerd[1562]: time="2025-12-12T17:27:56.798495255Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 12 17:27:56.798830 kubelet[2788]: E1212 17:27:56.798792 2788 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:27:56.798992 kubelet[2788]: E1212 17:27:56.798935 2788 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:27:56.800490 kubelet[2788]: E1212 17:27:56.799533 2788 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7rbr4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5cf75855f7-k7v95_calico-system(43e43bdb-6d56-4012-94dd-d2daf57ba1c9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 12 17:27:56.802898 kubelet[2788]: E1212 17:27:56.801829 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cf75855f7-k7v95" podUID="43e43bdb-6d56-4012-94dd-d2daf57ba1c9" Dec 12 17:27:58.106483 kubelet[2788]: E1212 17:27:58.106394 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-qj8f4" podUID="b3554d30-e274-4b18-8389-696d2cc03c37" Dec 12 17:28:02.105457 kubelet[2788]: E1212 17:28:02.105391 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-mf55c" podUID="b2a6e3f8-60c3-41cd-b9ef-8e00c63a997f" Dec 12 17:28:07.109235 kubelet[2788]: E1212 17:28:07.108634 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8f9f7d5ff-pj77s" podUID="c7baeefe-1ae7-4ac7-a668-035dfb7baaef" Dec 12 17:28:09.112200 containerd[1562]: time="2025-12-12T17:28:09.111513770Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:28:09.471373 containerd[1562]: time="2025-12-12T17:28:09.470932593Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:28:09.472845 containerd[1562]: time="2025-12-12T17:28:09.472719182Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:28:09.472845 containerd[1562]: time="2025-12-12T17:28:09.472824263Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:28:09.473030 kubelet[2788]: E1212 17:28:09.472981 2788 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:28:09.473992 kubelet[2788]: E1212 17:28:09.473914 2788 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:28:09.475017 kubelet[2788]: E1212 17:28:09.474949 2788 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gdt9m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-74c4865b69-rq46q_calico-apiserver(96b8454b-22d4-4695-8613-4bfabbdf8fc4): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:28:09.476333 kubelet[2788]: E1212 17:28:09.476249 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-74c4865b69-rq46q" podUID="96b8454b-22d4-4695-8613-4bfabbdf8fc4" Dec 12 17:28:10.107938 containerd[1562]: time="2025-12-12T17:28:10.107131181Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:28:10.460917 containerd[1562]: time="2025-12-12T17:28:10.460449601Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:28:10.464643 containerd[1562]: time="2025-12-12T17:28:10.464376544Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:28:10.464643 containerd[1562]: time="2025-12-12T17:28:10.464429545Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:28:10.464815 kubelet[2788]: E1212 17:28:10.464693 2788 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:28:10.464815 kubelet[2788]: E1212 17:28:10.464744 2788 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:28:10.465932 kubelet[2788]: E1212 17:28:10.464943 2788 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l79zc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-74c4865b69-4lwrd_calico-apiserver(2598be35-bdd7-4b8f-994d-8273d0db5ae9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:28:10.466215 kubelet[2788]: E1212 17:28:10.466177 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-74c4865b69-4lwrd" podUID="2598be35-bdd7-4b8f-994d-8273d0db5ae9" Dec 12 17:28:11.108678 kubelet[2788]: E1212 17:28:11.108599 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling 
image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cf75855f7-k7v95" podUID="43e43bdb-6d56-4012-94dd-d2daf57ba1c9" Dec 12 17:28:12.106876 containerd[1562]: time="2025-12-12T17:28:12.106788896Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 12 17:28:12.447903 containerd[1562]: time="2025-12-12T17:28:12.447769154Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:28:12.449620 containerd[1562]: time="2025-12-12T17:28:12.449566542Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 12 17:28:12.449739 containerd[1562]: time="2025-12-12T17:28:12.449655104Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 12 17:28:12.451096 kubelet[2788]: E1212 17:28:12.451031 2788 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": 
ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:28:12.451549 kubelet[2788]: E1212 17:28:12.451105 2788 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:28:12.451549 kubelet[2788]: E1212 17:28:12.451227 2788 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6lrv7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:
*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-qj8f4_calico-system(b3554d30-e274-4b18-8389-696d2cc03c37): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 12 17:28:12.455594 containerd[1562]: time="2025-12-12T17:28:12.455556837Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 12 17:28:12.808218 containerd[1562]: time="2025-12-12T17:28:12.808158840Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:28:12.809714 containerd[1562]: time="2025-12-12T17:28:12.809641784Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 12 17:28:12.809815 containerd[1562]: time="2025-12-12T17:28:12.809759145Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 12 17:28:12.810104 kubelet[2788]: E1212 17:28:12.810044 2788 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve 
reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:28:12.810176 kubelet[2788]: E1212 17:28:12.810111 2788 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:28:12.810601 kubelet[2788]: E1212 17:28:12.810544 2788 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6lrv7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:ni
l,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-qj8f4_calico-system(b3554d30-e274-4b18-8389-696d2cc03c37): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 12 17:28:12.811824 kubelet[2788]: E1212 17:28:12.811770 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-qj8f4" podUID="b3554d30-e274-4b18-8389-696d2cc03c37" Dec 12 17:28:14.107891 containerd[1562]: time="2025-12-12T17:28:14.107379248Z" 
level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\""
Dec 12 17:28:14.456975 containerd[1562]: time="2025-12-12T17:28:14.456782437Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 17:28:14.458524 containerd[1562]: time="2025-12-12T17:28:14.458473384Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found"
Dec 12 17:28:14.458658 containerd[1562]: time="2025-12-12T17:28:14.458583466Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77"
Dec 12 17:28:14.458753 kubelet[2788]: E1212 17:28:14.458713 2788 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4"
Dec 12 17:28:14.459096 kubelet[2788]: E1212 17:28:14.458765 2788 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4"
Dec 12 17:28:14.459421 kubelet[2788]: E1212 17:28:14.459326 2788 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6n2ms,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-mf55c_calico-system(b2a6e3f8-60c3-41cd-b9ef-8e00c63a997f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError"
Dec 12 17:28:14.460966 kubelet[2788]: E1212 17:28:14.460911 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-mf55c" podUID="b2a6e3f8-60c3-41cd-b9ef-8e00c63a997f"
Dec 12 17:28:20.106316 kubelet[2788]: E1212 17:28:20.106133 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-74c4865b69-rq46q" podUID="96b8454b-22d4-4695-8613-4bfabbdf8fc4"
Dec 12 17:28:21.109045 containerd[1562]: time="2025-12-12T17:28:21.108997154Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\""
Dec 12 17:28:21.495024 containerd[1562]: time="2025-12-12T17:28:21.494831494Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 17:28:21.498495 containerd[1562]: time="2025-12-12T17:28:21.498407989Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found"
Dec 12 17:28:21.498619 containerd[1562]: time="2025-12-12T17:28:21.498573991Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85"
Dec 12 17:28:21.500419 kubelet[2788]: E1212 17:28:21.500061 2788 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4"
Dec 12 17:28:21.500419 kubelet[2788]: E1212 17:28:21.500116 2788 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4"
Dec 12 17:28:21.500419 kubelet[2788]: E1212 17:28:21.500264 2788 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xwlm8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-8f9f7d5ff-pj77s_calico-system(c7baeefe-1ae7-4ac7-a668-035dfb7baaef): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError"
Dec 12 17:28:21.501562 kubelet[2788]: E1212 17:28:21.501529 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8f9f7d5ff-pj77s" podUID="c7baeefe-1ae7-4ac7-a668-035dfb7baaef"
Dec 12 17:28:23.107824 kubelet[2788]: E1212 17:28:23.107644 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-74c4865b69-4lwrd" podUID="2598be35-bdd7-4b8f-994d-8273d0db5ae9"
Dec 12 17:28:24.111273 kubelet[2788]: E1212 17:28:24.111097 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-qj8f4" podUID="b3554d30-e274-4b18-8389-696d2cc03c37"
Dec 12 17:28:24.362272 systemd[1]: Started sshd@7-46.224.132.113:22-139.178.89.65:56436.service - OpenSSH per-connection server daemon (139.178.89.65:56436).
Dec 12 17:28:25.355285 sshd[4992]: Accepted publickey for core from 139.178.89.65 port 56436 ssh2: RSA SHA256:iFtGnG2WH9XVjjUjszxJhaCaYvl4oOJ7+tJOMAqvDiA
Dec 12 17:28:25.358438 sshd-session[4992]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:28:25.367947 systemd-logind[1523]: New session 8 of user core.
Dec 12 17:28:25.372095 systemd[1]: Started session-8.scope - Session 8 of User core.
Dec 12 17:28:26.107132 kubelet[2788]: E1212 17:28:26.107073 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cf75855f7-k7v95" podUID="43e43bdb-6d56-4012-94dd-d2daf57ba1c9"
Dec 12 17:28:26.143268 sshd[4997]: Connection closed by 139.178.89.65 port 56436
Dec 12 17:28:26.144400 sshd-session[4992]: pam_unix(sshd:session): session closed for user core
Dec 12 17:28:26.151666 systemd-logind[1523]: Session 8 logged out. Waiting for processes to exit.
Dec 12 17:28:26.152417 systemd[1]: sshd@7-46.224.132.113:22-139.178.89.65:56436.service: Deactivated successfully.
Dec 12 17:28:26.157915 systemd[1]: session-8.scope: Deactivated successfully.
Dec 12 17:28:26.162456 systemd-logind[1523]: Removed session 8.
Dec 12 17:28:30.106478 kubelet[2788]: E1212 17:28:30.106382 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-mf55c" podUID="b2a6e3f8-60c3-41cd-b9ef-8e00c63a997f"
Dec 12 17:28:31.318402 systemd[1]: Started sshd@8-46.224.132.113:22-139.178.89.65:58378.service - OpenSSH per-connection server daemon (139.178.89.65:58378).
Dec 12 17:28:32.319659 sshd[5012]: Accepted publickey for core from 139.178.89.65 port 58378 ssh2: RSA SHA256:iFtGnG2WH9XVjjUjszxJhaCaYvl4oOJ7+tJOMAqvDiA
Dec 12 17:28:32.321782 sshd-session[5012]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:28:32.328205 systemd-logind[1523]: New session 9 of user core.
Dec 12 17:28:32.333141 systemd[1]: Started session-9.scope - Session 9 of User core.
Dec 12 17:28:33.082310 sshd[5015]: Connection closed by 139.178.89.65 port 58378
Dec 12 17:28:33.083087 sshd-session[5012]: pam_unix(sshd:session): session closed for user core
Dec 12 17:28:33.092451 systemd[1]: sshd@8-46.224.132.113:22-139.178.89.65:58378.service: Deactivated successfully.
Dec 12 17:28:33.094317 systemd-logind[1523]: Session 9 logged out. Waiting for processes to exit.
Dec 12 17:28:33.096281 systemd[1]: session-9.scope: Deactivated successfully.
Dec 12 17:28:33.099625 systemd-logind[1523]: Removed session 9.
Dec 12 17:28:33.256259 systemd[1]: Started sshd@9-46.224.132.113:22-139.178.89.65:58394.service - OpenSSH per-connection server daemon (139.178.89.65:58394).
Dec 12 17:28:34.107657 kubelet[2788]: E1212 17:28:34.106427 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8f9f7d5ff-pj77s" podUID="c7baeefe-1ae7-4ac7-a668-035dfb7baaef"
Dec 12 17:28:34.260997 sshd[5029]: Accepted publickey for core from 139.178.89.65 port 58394 ssh2: RSA SHA256:iFtGnG2WH9XVjjUjszxJhaCaYvl4oOJ7+tJOMAqvDiA
Dec 12 17:28:34.264535 sshd-session[5029]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:28:34.273153 systemd-logind[1523]: New session 10 of user core.
Dec 12 17:28:34.278201 systemd[1]: Started session-10.scope - Session 10 of User core.
Dec 12 17:28:35.111832 kubelet[2788]: E1212 17:28:35.111332 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-74c4865b69-4lwrd" podUID="2598be35-bdd7-4b8f-994d-8273d0db5ae9"
Dec 12 17:28:35.111832 kubelet[2788]: E1212 17:28:35.111768 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-74c4865b69-rq46q" podUID="96b8454b-22d4-4695-8613-4bfabbdf8fc4"
Dec 12 17:28:35.114484 kubelet[2788]: E1212 17:28:35.114015 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-qj8f4" podUID="b3554d30-e274-4b18-8389-696d2cc03c37"
Dec 12 17:28:35.121356 sshd[5033]: Connection closed by 139.178.89.65 port 58394
Dec 12 17:28:35.121949 sshd-session[5029]: pam_unix(sshd:session): session closed for user core
Dec 12 17:28:35.132643 systemd[1]: sshd@9-46.224.132.113:22-139.178.89.65:58394.service: Deactivated successfully.
Dec 12 17:28:35.138788 systemd[1]: session-10.scope: Deactivated successfully.
Dec 12 17:28:35.141877 systemd-logind[1523]: Session 10 logged out. Waiting for processes to exit.
Dec 12 17:28:35.147996 systemd-logind[1523]: Removed session 10.
Dec 12 17:28:35.288558 systemd[1]: Started sshd@10-46.224.132.113:22-139.178.89.65:58410.service - OpenSSH per-connection server daemon (139.178.89.65:58410).
Dec 12 17:28:36.278976 sshd[5043]: Accepted publickey for core from 139.178.89.65 port 58410 ssh2: RSA SHA256:iFtGnG2WH9XVjjUjszxJhaCaYvl4oOJ7+tJOMAqvDiA
Dec 12 17:28:36.281655 sshd-session[5043]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:28:36.289341 systemd-logind[1523]: New session 11 of user core.
Dec 12 17:28:36.296082 systemd[1]: Started session-11.scope - Session 11 of User core.
Dec 12 17:28:37.091669 sshd[5071]: Connection closed by 139.178.89.65 port 58410
Dec 12 17:28:37.093240 sshd-session[5043]: pam_unix(sshd:session): session closed for user core
Dec 12 17:28:37.098460 systemd-logind[1523]: Session 11 logged out. Waiting for processes to exit.
Dec 12 17:28:37.099345 systemd[1]: sshd@10-46.224.132.113:22-139.178.89.65:58410.service: Deactivated successfully.
Dec 12 17:28:37.103329 systemd[1]: session-11.scope: Deactivated successfully.
Dec 12 17:28:37.109706 systemd-logind[1523]: Removed session 11.
Dec 12 17:28:39.110897 kubelet[2788]: E1212 17:28:39.110745 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cf75855f7-k7v95" podUID="43e43bdb-6d56-4012-94dd-d2daf57ba1c9"
Dec 12 17:28:42.107767 kubelet[2788]: E1212 17:28:42.107718 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-mf55c" podUID="b2a6e3f8-60c3-41cd-b9ef-8e00c63a997f"
Dec 12 17:28:42.261918 systemd[1]: Started sshd@11-46.224.132.113:22-139.178.89.65:44774.service - OpenSSH per-connection server daemon (139.178.89.65:44774).
Dec 12 17:28:43.250915 sshd[5086]: Accepted publickey for core from 139.178.89.65 port 44774 ssh2: RSA SHA256:iFtGnG2WH9XVjjUjszxJhaCaYvl4oOJ7+tJOMAqvDiA
Dec 12 17:28:43.253440 sshd-session[5086]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:28:43.263001 systemd-logind[1523]: New session 12 of user core.
Dec 12 17:28:43.266100 systemd[1]: Started session-12.scope - Session 12 of User core.
Dec 12 17:28:44.049462 sshd[5089]: Connection closed by 139.178.89.65 port 44774
Dec 12 17:28:44.050130 sshd-session[5086]: pam_unix(sshd:session): session closed for user core
Dec 12 17:28:44.056307 systemd[1]: session-12.scope: Deactivated successfully.
Dec 12 17:28:44.058436 systemd[1]: sshd@11-46.224.132.113:22-139.178.89.65:44774.service: Deactivated successfully.
Dec 12 17:28:44.066347 systemd-logind[1523]: Session 12 logged out. Waiting for processes to exit.
Dec 12 17:28:44.069816 systemd-logind[1523]: Removed session 12.
Dec 12 17:28:47.106298 kubelet[2788]: E1212 17:28:47.106238 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-74c4865b69-rq46q" podUID="96b8454b-22d4-4695-8613-4bfabbdf8fc4"
Dec 12 17:28:47.107506 kubelet[2788]: E1212 17:28:47.107405 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-74c4865b69-4lwrd" podUID="2598be35-bdd7-4b8f-994d-8273d0db5ae9"
Dec 12 17:28:49.108784 kubelet[2788]: E1212 17:28:49.108736 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-qj8f4" podUID="b3554d30-e274-4b18-8389-696d2cc03c37"
Dec 12 17:28:49.110232 kubelet[2788]: E1212 17:28:49.109753 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8f9f7d5ff-pj77s" podUID="c7baeefe-1ae7-4ac7-a668-035dfb7baaef"
Dec 12 17:28:49.229171 systemd[1]: Started sshd@12-46.224.132.113:22-139.178.89.65:44788.service - OpenSSH per-connection server daemon (139.178.89.65:44788).
Dec 12 17:28:50.230011 sshd[5103]: Accepted publickey for core from 139.178.89.65 port 44788 ssh2: RSA SHA256:iFtGnG2WH9XVjjUjszxJhaCaYvl4oOJ7+tJOMAqvDiA
Dec 12 17:28:50.232633 sshd-session[5103]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:28:50.241961 systemd-logind[1523]: New session 13 of user core.
Dec 12 17:28:50.247067 systemd[1]: Started session-13.scope - Session 13 of User core.
Dec 12 17:28:50.997180 sshd[5106]: Connection closed by 139.178.89.65 port 44788
Dec 12 17:28:50.995458 sshd-session[5103]: pam_unix(sshd:session): session closed for user core
Dec 12 17:28:51.001542 systemd-logind[1523]: Session 13 logged out. Waiting for processes to exit.
Dec 12 17:28:51.003530 systemd[1]: sshd@12-46.224.132.113:22-139.178.89.65:44788.service: Deactivated successfully.
Dec 12 17:28:51.006832 systemd[1]: session-13.scope: Deactivated successfully.
Dec 12 17:28:51.013267 systemd-logind[1523]: Removed session 13.
Dec 12 17:28:51.108015 kubelet[2788]: E1212 17:28:51.107930 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cf75855f7-k7v95" podUID="43e43bdb-6d56-4012-94dd-d2daf57ba1c9"
Dec 12 17:28:56.165129 systemd[1]: Started sshd@13-46.224.132.113:22-139.178.89.65:42902.service - OpenSSH per-connection server daemon (139.178.89.65:42902).
Dec 12 17:28:57.106421 kubelet[2788]: E1212 17:28:57.106355 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-mf55c" podUID="b2a6e3f8-60c3-41cd-b9ef-8e00c63a997f"
Dec 12 17:28:57.135639 sshd[5120]: Accepted publickey for core from 139.178.89.65 port 42902 ssh2: RSA SHA256:iFtGnG2WH9XVjjUjszxJhaCaYvl4oOJ7+tJOMAqvDiA
Dec 12 17:28:57.138319 sshd-session[5120]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:28:57.147965 systemd-logind[1523]: New session 14 of user core.
Dec 12 17:28:57.152892 systemd[1]: Started session-14.scope - Session 14 of User core.
Dec 12 17:28:57.929645 sshd[5123]: Connection closed by 139.178.89.65 port 42902
Dec 12 17:28:57.931091 sshd-session[5120]: pam_unix(sshd:session): session closed for user core
Dec 12 17:28:57.935921 systemd-logind[1523]: Session 14 logged out. Waiting for processes to exit.
Dec 12 17:28:57.936324 systemd[1]: sshd@13-46.224.132.113:22-139.178.89.65:42902.service: Deactivated successfully.
Dec 12 17:28:57.939009 systemd[1]: session-14.scope: Deactivated successfully.
Dec 12 17:28:57.947324 systemd-logind[1523]: Removed session 14.
Dec 12 17:28:58.099220 systemd[1]: Started sshd@14-46.224.132.113:22-139.178.89.65:42904.service - OpenSSH per-connection server daemon (139.178.89.65:42904).
Dec 12 17:28:59.075362 sshd[5134]: Accepted publickey for core from 139.178.89.65 port 42904 ssh2: RSA SHA256:iFtGnG2WH9XVjjUjszxJhaCaYvl4oOJ7+tJOMAqvDiA
Dec 12 17:28:59.078208 sshd-session[5134]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:28:59.086677 systemd-logind[1523]: New session 15 of user core.
Dec 12 17:28:59.090165 systemd[1]: Started session-15.scope - Session 15 of User core.
Dec 12 17:29:00.037484 sshd[5137]: Connection closed by 139.178.89.65 port 42904
Dec 12 17:29:00.038150 sshd-session[5134]: pam_unix(sshd:session): session closed for user core
Dec 12 17:29:00.043326 systemd[1]: sshd@14-46.224.132.113:22-139.178.89.65:42904.service: Deactivated successfully.
Dec 12 17:29:00.049671 systemd[1]: session-15.scope: Deactivated successfully.
Dec 12 17:29:00.052426 systemd-logind[1523]: Session 15 logged out. Waiting for processes to exit.
Dec 12 17:29:00.054764 systemd-logind[1523]: Removed session 15.
Dec 12 17:29:00.211980 systemd[1]: Started sshd@15-46.224.132.113:22-139.178.89.65:42906.service - OpenSSH per-connection server daemon (139.178.89.65:42906).
Dec 12 17:29:01.200516 sshd[5147]: Accepted publickey for core from 139.178.89.65 port 42906 ssh2: RSA SHA256:iFtGnG2WH9XVjjUjszxJhaCaYvl4oOJ7+tJOMAqvDiA
Dec 12 17:29:01.202472 sshd-session[5147]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:29:01.209756 systemd-logind[1523]: New session 16 of user core.
Dec 12 17:29:01.217101 systemd[1]: Started session-16.scope - Session 16 of User core.
Dec 12 17:29:02.106648 kubelet[2788]: E1212 17:29:02.106569 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-74c4865b69-rq46q" podUID="96b8454b-22d4-4695-8613-4bfabbdf8fc4"
Dec 12 17:29:02.108214 kubelet[2788]: E1212 17:29:02.107580 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8f9f7d5ff-pj77s" podUID="c7baeefe-1ae7-4ac7-a668-035dfb7baaef"
Dec 12 17:29:02.108879 kubelet[2788]: E1212 17:29:02.108466 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-qj8f4" podUID="b3554d30-e274-4b18-8389-696d2cc03c37"
Dec 12 17:29:02.109222 kubelet[2788]: E1212 17:29:02.109122 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-74c4865b69-4lwrd" podUID="2598be35-bdd7-4b8f-994d-8273d0db5ae9"
Dec 12 17:29:02.715773 sshd[5150]: Connection closed by 139.178.89.65 port 42906
Dec 12 17:29:02.715650 sshd-session[5147]: pam_unix(sshd:session): session closed for user core
Dec 12 17:29:02.721657 systemd[1]: sshd@15-46.224.132.113:22-139.178.89.65:42906.service: Deactivated successfully.
Dec 12 17:29:02.725768 systemd[1]: session-16.scope: Deactivated successfully.
Dec 12 17:29:02.729811 systemd-logind[1523]: Session 16 logged out. Waiting for processes to exit.
Dec 12 17:29:02.732036 systemd-logind[1523]: Removed session 16.
Dec 12 17:29:02.884165 systemd[1]: Started sshd@16-46.224.132.113:22-139.178.89.65:57526.service - OpenSSH per-connection server daemon (139.178.89.65:57526).
Dec 12 17:29:03.880691 sshd[5169]: Accepted publickey for core from 139.178.89.65 port 57526 ssh2: RSA SHA256:iFtGnG2WH9XVjjUjszxJhaCaYvl4oOJ7+tJOMAqvDiA
Dec 12 17:29:03.884523 sshd-session[5169]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:29:03.891733 systemd-logind[1523]: New session 17 of user core.
Dec 12 17:29:03.901139 systemd[1]: Started session-17.scope - Session 17 of User core.
Dec 12 17:29:04.822056 sshd[5172]: Connection closed by 139.178.89.65 port 57526
Dec 12 17:29:04.822979 sshd-session[5169]: pam_unix(sshd:session): session closed for user core
Dec 12 17:29:04.830054 systemd[1]: sshd@16-46.224.132.113:22-139.178.89.65:57526.service: Deactivated successfully.
Dec 12 17:29:04.835146 systemd[1]: session-17.scope: Deactivated successfully.
Dec 12 17:29:04.836827 systemd-logind[1523]: Session 17 logged out. Waiting for processes to exit.
Dec 12 17:29:04.838456 systemd-logind[1523]: Removed session 17.
Dec 12 17:29:04.995634 systemd[1]: Started sshd@17-46.224.132.113:22-139.178.89.65:57538.service - OpenSSH per-connection server daemon (139.178.89.65:57538).
Dec 12 17:29:05.982344 sshd[5182]: Accepted publickey for core from 139.178.89.65 port 57538 ssh2: RSA SHA256:iFtGnG2WH9XVjjUjszxJhaCaYvl4oOJ7+tJOMAqvDiA
Dec 12 17:29:05.984316 sshd-session[5182]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:29:05.990286 systemd-logind[1523]: New session 18 of user core.
Dec 12 17:29:05.997129 systemd[1]: Started session-18.scope - Session 18 of User core.
Dec 12 17:29:06.106113 kubelet[2788]: E1212 17:29:06.106030 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cf75855f7-k7v95" podUID="43e43bdb-6d56-4012-94dd-d2daf57ba1c9"
Dec 12 17:29:06.731745 sshd[5209]: Connection closed by 139.178.89.65 port 57538
Dec 12 17:29:06.734180 sshd-session[5182]: pam_unix(sshd:session): session closed for user core
Dec 12 17:29:06.739582 systemd[1]: sshd@17-46.224.132.113:22-139.178.89.65:57538.service: Deactivated successfully.
Dec 12 17:29:06.743536 systemd[1]: session-18.scope: Deactivated successfully.
Dec 12 17:29:06.745320 systemd-logind[1523]: Session 18 logged out. Waiting for processes to exit.
Dec 12 17:29:06.747578 systemd-logind[1523]: Removed session 18.
Dec 12 17:29:09.107806 kubelet[2788]: E1212 17:29:09.107649 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-mf55c" podUID="b2a6e3f8-60c3-41cd-b9ef-8e00c63a997f"
Dec 12 17:29:11.907207 systemd[1]: Started sshd@18-46.224.132.113:22-139.178.89.65:51312.service - OpenSSH per-connection server daemon (139.178.89.65:51312).
Dec 12 17:29:12.902496 sshd[5223]: Accepted publickey for core from 139.178.89.65 port 51312 ssh2: RSA SHA256:iFtGnG2WH9XVjjUjszxJhaCaYvl4oOJ7+tJOMAqvDiA
Dec 12 17:29:12.903471 sshd-session[5223]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:29:12.910312 systemd-logind[1523]: New session 19 of user core.
Dec 12 17:29:12.915142 systemd[1]: Started session-19.scope - Session 19 of User core.
Dec 12 17:29:13.107046 kubelet[2788]: E1212 17:29:13.106393 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-74c4865b69-4lwrd" podUID="2598be35-bdd7-4b8f-994d-8273d0db5ae9"
Dec 12 17:29:13.659510 sshd[5226]: Connection closed by 139.178.89.65 port 51312
Dec 12 17:29:13.660352 sshd-session[5223]: pam_unix(sshd:session): session closed for user core
Dec 12 17:29:13.666702 systemd[1]: sshd@18-46.224.132.113:22-139.178.89.65:51312.service: Deactivated successfully.
Dec 12 17:29:13.671756 systemd[1]: session-19.scope: Deactivated successfully.
Dec 12 17:29:13.677108 systemd-logind[1523]: Session 19 logged out. Waiting for processes to exit.
Dec 12 17:29:13.681187 systemd-logind[1523]: Removed session 19.
Dec 12 17:29:16.107372 kubelet[2788]: E1212 17:29:16.107278 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8f9f7d5ff-pj77s" podUID="c7baeefe-1ae7-4ac7-a668-035dfb7baaef"
Dec 12 17:29:16.108123 kubelet[2788]: E1212 17:29:16.107454 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-74c4865b69-rq46q" podUID="96b8454b-22d4-4695-8613-4bfabbdf8fc4"
Dec 12 17:29:17.111673 kubelet[2788]: E1212 17:29:17.111307 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-qj8f4" podUID="b3554d30-e274-4b18-8389-696d2cc03c37"
Dec 12 17:29:18.841158 systemd[1]: Started sshd@19-46.224.132.113:22-139.178.89.65:51342.service - OpenSSH per-connection server daemon (139.178.89.65:51342).
Dec 12 17:29:19.895153 sshd[5246]: Accepted publickey for core from 139.178.89.65 port 51342 ssh2: RSA SHA256:iFtGnG2WH9XVjjUjszxJhaCaYvl4oOJ7+tJOMAqvDiA
Dec 12 17:29:19.898446 sshd-session[5246]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:29:19.905339 systemd-logind[1523]: New session 20 of user core.
Dec 12 17:29:19.911134 systemd[1]: Started session-20.scope - Session 20 of User core.
Dec 12 17:29:20.738055 sshd[5249]: Connection closed by 139.178.89.65 port 51342
Dec 12 17:29:20.738570 sshd-session[5246]: pam_unix(sshd:session): session closed for user core
Dec 12 17:29:20.746258 systemd-logind[1523]: Session 20 logged out. Waiting for processes to exit.
Dec 12 17:29:20.747105 systemd[1]: sshd@19-46.224.132.113:22-139.178.89.65:51342.service: Deactivated successfully.
Dec 12 17:29:20.750160 systemd[1]: session-20.scope: Deactivated successfully.
Dec 12 17:29:20.752631 systemd-logind[1523]: Removed session 20.
Dec 12 17:29:21.106646 kubelet[2788]: E1212 17:29:21.106558 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-mf55c" podUID="b2a6e3f8-60c3-41cd-b9ef-8e00c63a997f"
Dec 12 17:29:21.113044 containerd[1562]: time="2025-12-12T17:29:21.112989878Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\""
Dec 12 17:29:21.444028 containerd[1562]: time="2025-12-12T17:29:21.443846950Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 17:29:21.445388 containerd[1562]: time="2025-12-12T17:29:21.445325650Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found"
Dec 12 17:29:21.445388 containerd[1562]: time="2025-12-12T17:29:21.445350771Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73"
Dec 12 17:29:21.445654 kubelet[2788]: E1212 17:29:21.445570 2788 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4"
Dec 12 17:29:21.445654 kubelet[2788]: E1212 17:29:21.445615 2788 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4"
Dec 12 17:29:21.453947 kubelet[2788]: E1212 17:29:21.453847 2788 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:c902c75b8da444adacbeeefd6f418d52,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7rbr4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5cf75855f7-k7v95_calico-system(43e43bdb-6d56-4012-94dd-d2daf57ba1c9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError"
Dec 12 17:29:21.456446 containerd[1562]: time="2025-12-12T17:29:21.456402247Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\""
Dec 12 17:29:21.799209 containerd[1562]: time="2025-12-12T17:29:21.799148846Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 17:29:21.800735 containerd[1562]: time="2025-12-12T17:29:21.800639227Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found"
Dec 12 17:29:21.800850 containerd[1562]: time="2025-12-12T17:29:21.800768509Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85"
Dec 12 17:29:21.802256 kubelet[2788]: E1212 17:29:21.801322 2788 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4"
Dec 12 17:29:21.802256 kubelet[2788]: E1212 17:29:21.802018 2788 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4"
Dec 12 17:29:21.802256 kubelet[2788]: E1212 17:29:21.802161 2788 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7rbr4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5cf75855f7-k7v95_calico-system(43e43bdb-6d56-4012-94dd-d2daf57ba1c9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError"
Dec 12 17:29:21.803900 kubelet[2788]: E1212 17:29:21.803752 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cf75855f7-k7v95" podUID="43e43bdb-6d56-4012-94dd-d2daf57ba1c9"
Dec 12 17:29:26.106456 kubelet[2788]: E1212 17:29:26.106360 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-74c4865b69-4lwrd" podUID="2598be35-bdd7-4b8f-994d-8273d0db5ae9"
Dec 12 17:29:27.105819 kubelet[2788]: E1212 17:29:27.105651 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-74c4865b69-rq46q" podUID="96b8454b-22d4-4695-8613-4bfabbdf8fc4"
Dec 12 17:29:28.105796 kubelet[2788]: E1212 17:29:28.105709 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8f9f7d5ff-pj77s" podUID="c7baeefe-1ae7-4ac7-a668-035dfb7baaef"
Dec 12 17:29:28.107201 kubelet[2788]: E1212 17:29:28.107091 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-qj8f4" podUID="b3554d30-e274-4b18-8389-696d2cc03c37"
Dec 12 17:29:34.108439 kubelet[2788]: E1212 17:29:34.108270 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cf75855f7-k7v95" podUID="43e43bdb-6d56-4012-94dd-d2daf57ba1c9"
Dec 12 17:29:35.106502 containerd[1562]: time="2025-12-12T17:29:35.106353229Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\""
Dec 12 17:29:35.458594 containerd[1562]: time="2025-12-12T17:29:35.458285650Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 17:29:35.460759 containerd[1562]: time="2025-12-12T17:29:35.460697964Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found"
Dec 12 17:29:35.461122 containerd[1562]: time="2025-12-12T17:29:35.460920407Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77"
Dec 12 17:29:35.461650 kubelet[2788]: E1212 17:29:35.461330 2788 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4"
Dec 12 17:29:35.461650 kubelet[2788]: E1212 17:29:35.461390 2788 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4"
Dec 12 17:29:35.461650 kubelet[2788]: E1212 17:29:35.461568 2788 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6n2ms,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-mf55c_calico-system(b2a6e3f8-60c3-41cd-b9ef-8e00c63a997f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError"
Dec 12 17:29:35.462918 kubelet[2788]: E1212 17:29:35.462837 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-mf55c" podUID="b2a6e3f8-60c3-41cd-b9ef-8e00c63a997f"
Dec 12 17:29:35.602406 kubelet[2788]: E1212 17:29:35.602330 2788 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:38820->10.0.0.2:2379: read: connection timed out"
Dec 12 17:29:35.609085 systemd[1]: cri-containerd-a8d957a545236303dbe09a7bda25224eb83cd1be5aa27cf85a8c7c47a587c78c.scope: Deactivated successfully.
Dec 12 17:29:35.609401 systemd[1]: cri-containerd-a8d957a545236303dbe09a7bda25224eb83cd1be5aa27cf85a8c7c47a587c78c.scope: Consumed 3.312s CPU time, 24.6M memory peak, 3.6M read from disk.
Dec 12 17:29:35.613259 containerd[1562]: time="2025-12-12T17:29:35.612845540Z" level=info msg="received container exit event container_id:\"a8d957a545236303dbe09a7bda25224eb83cd1be5aa27cf85a8c7c47a587c78c\" id:\"a8d957a545236303dbe09a7bda25224eb83cd1be5aa27cf85a8c7c47a587c78c\" pid:2651 exit_status:1 exited_at:{seconds:1765560575 nanos:612483735}"
Dec 12 17:29:35.641650 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a8d957a545236303dbe09a7bda25224eb83cd1be5aa27cf85a8c7c47a587c78c-rootfs.mount: Deactivated successfully.
Dec 12 17:29:35.796674 kubelet[2788]: E1212 17:29:35.796278 2788 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:38638->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{calico-apiserver-74c4865b69-4lwrd.188087d4a931db3b calico-apiserver 1770 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:calico-apiserver,Name:calico-apiserver-74c4865b69-4lwrd,UID:2598be35-bdd7-4b8f-994d-8273d0db5ae9,APIVersion:v1,ResourceVersion:854,FieldPath:spec.containers{calico-apiserver},},Reason:BackOff,Message:Back-off pulling image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\",Source:EventSource{Component:kubelet,Host:ci-4459-2-2-0-24adfa6772,},FirstTimestamp:2025-12-12 17:26:41 +0000 UTC,LastTimestamp:2025-12-12 17:29:26.106269583 +0000 UTC m=+217.138363388,Count:11,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-2-0-24adfa6772,}"
Dec 12 17:29:35.901419 systemd[1]: cri-containerd-eb26b6a3c3ea543c7f716a8a0daac4c8fdce335945eff23c0ee42cb3a0cedb49.scope: Deactivated successfully.
Dec 12 17:29:35.903275 systemd[1]: cri-containerd-eb26b6a3c3ea543c7f716a8a0daac4c8fdce335945eff23c0ee42cb3a0cedb49.scope: Consumed 39.207s CPU time, 116.7M memory peak.
Dec 12 17:29:35.906007 containerd[1562]: time="2025-12-12T17:29:35.905848493Z" level=info msg="received container exit event container_id:\"eb26b6a3c3ea543c7f716a8a0daac4c8fdce335945eff23c0ee42cb3a0cedb49\" id:\"eb26b6a3c3ea543c7f716a8a0daac4c8fdce335945eff23c0ee42cb3a0cedb49\" pid:3112 exit_status:1 exited_at:{seconds:1765560575 nanos:905498128}"
Dec 12 17:29:35.939245 kubelet[2788]: I1212 17:29:35.939163 2788 scope.go:117] "RemoveContainer" containerID="a8d957a545236303dbe09a7bda25224eb83cd1be5aa27cf85a8c7c47a587c78c"
Dec 12 17:29:35.939449 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-eb26b6a3c3ea543c7f716a8a0daac4c8fdce335945eff23c0ee42cb3a0cedb49-rootfs.mount: Deactivated successfully.
Dec 12 17:29:35.952306 containerd[1562]: time="2025-12-12T17:29:35.952258664Z" level=info msg="CreateContainer within sandbox \"66db16d681c611875d727380fff0cf00dd0d8b798ca00c2234c5ee031b952b58\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
Dec 12 17:29:35.966085 containerd[1562]: time="2025-12-12T17:29:35.966037618Z" level=info msg="Container 63b7c66e891a7ec8046ee428f97045a9651c1d8d7f6b45768c1510d4e9c8d48d: CDI devices from CRI Config.CDIDevices: []"
Dec 12 17:29:35.970597 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3208706117.mount: Deactivated successfully.
Dec 12 17:29:35.979577 containerd[1562]: time="2025-12-12T17:29:35.979530367Z" level=info msg="CreateContainer within sandbox \"66db16d681c611875d727380fff0cf00dd0d8b798ca00c2234c5ee031b952b58\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"63b7c66e891a7ec8046ee428f97045a9651c1d8d7f6b45768c1510d4e9c8d48d\"" Dec 12 17:29:35.980250 containerd[1562]: time="2025-12-12T17:29:35.980224377Z" level=info msg="StartContainer for \"63b7c66e891a7ec8046ee428f97045a9651c1d8d7f6b45768c1510d4e9c8d48d\"" Dec 12 17:29:35.981886 containerd[1562]: time="2025-12-12T17:29:35.981634597Z" level=info msg="connecting to shim 63b7c66e891a7ec8046ee428f97045a9651c1d8d7f6b45768c1510d4e9c8d48d" address="unix:///run/containerd/s/35d8e11efdb9e137958ace600bd84c23d885c8ad17f33b0f11f55af5270302e1" protocol=ttrpc version=3 Dec 12 17:29:36.002136 systemd[1]: Started cri-containerd-63b7c66e891a7ec8046ee428f97045a9651c1d8d7f6b45768c1510d4e9c8d48d.scope - libcontainer container 63b7c66e891a7ec8046ee428f97045a9651c1d8d7f6b45768c1510d4e9c8d48d. 
Dec 12 17:29:36.051146 containerd[1562]: time="2025-12-12T17:29:36.050581204Z" level=info msg="StartContainer for \"63b7c66e891a7ec8046ee428f97045a9651c1d8d7f6b45768c1510d4e9c8d48d\" returns successfully" Dec 12 17:29:36.946480 kubelet[2788]: I1212 17:29:36.945965 2788 scope.go:117] "RemoveContainer" containerID="eb26b6a3c3ea543c7f716a8a0daac4c8fdce335945eff23c0ee42cb3a0cedb49" Dec 12 17:29:36.948797 containerd[1562]: time="2025-12-12T17:29:36.948728489Z" level=info msg="CreateContainer within sandbox \"0a89e4ea8f2b879dd3816bdcf9dc6c7cb56256747102bed9d270e23530b49f23\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Dec 12 17:29:36.964898 containerd[1562]: time="2025-12-12T17:29:36.963749979Z" level=info msg="Container e37cacd1c6edb3e6db1239d4b1c209851e91ba79d13101a49c99b7e7fdac2924: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:29:36.967644 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3564598423.mount: Deactivated successfully. Dec 12 17:29:36.977932 containerd[1562]: time="2025-12-12T17:29:36.977893658Z" level=info msg="CreateContainer within sandbox \"0a89e4ea8f2b879dd3816bdcf9dc6c7cb56256747102bed9d270e23530b49f23\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"e37cacd1c6edb3e6db1239d4b1c209851e91ba79d13101a49c99b7e7fdac2924\"" Dec 12 17:29:36.978562 containerd[1562]: time="2025-12-12T17:29:36.978528747Z" level=info msg="StartContainer for \"e37cacd1c6edb3e6db1239d4b1c209851e91ba79d13101a49c99b7e7fdac2924\"" Dec 12 17:29:36.979856 containerd[1562]: time="2025-12-12T17:29:36.979751684Z" level=info msg="connecting to shim e37cacd1c6edb3e6db1239d4b1c209851e91ba79d13101a49c99b7e7fdac2924" address="unix:///run/containerd/s/26cb6ef1cec77fd5d8571867ad212ac16219331571f135ec2574cf455b244f95" protocol=ttrpc version=3 Dec 12 17:29:37.008129 systemd[1]: Started cri-containerd-e37cacd1c6edb3e6db1239d4b1c209851e91ba79d13101a49c99b7e7fdac2924.scope - libcontainer container 
e37cacd1c6edb3e6db1239d4b1c209851e91ba79d13101a49c99b7e7fdac2924. Dec 12 17:29:37.053606 containerd[1562]: time="2025-12-12T17:29:37.053563480Z" level=info msg="StartContainer for \"e37cacd1c6edb3e6db1239d4b1c209851e91ba79d13101a49c99b7e7fdac2924\" returns successfully" Dec 12 17:29:37.084589 systemd[1]: cri-containerd-141bfb998dfec7bf5820e408aaf2856e330d2c0d81a57d59f00c6c4ed67f9cbc.scope: Deactivated successfully. Dec 12 17:29:37.085127 systemd[1]: cri-containerd-141bfb998dfec7bf5820e408aaf2856e330d2c0d81a57d59f00c6c4ed67f9cbc.scope: Consumed 5.201s CPU time, 62.9M memory peak, 4.1M read from disk. Dec 12 17:29:37.090638 containerd[1562]: time="2025-12-12T17:29:37.090417637Z" level=info msg="received container exit event container_id:\"141bfb998dfec7bf5820e408aaf2856e330d2c0d81a57d59f00c6c4ed67f9cbc\" id:\"141bfb998dfec7bf5820e408aaf2856e330d2c0d81a57d59f00c6c4ed67f9cbc\" pid:2626 exit_status:1 exited_at:{seconds:1765560577 nanos:88011923}" Dec 12 17:29:37.139991 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-141bfb998dfec7bf5820e408aaf2856e330d2c0d81a57d59f00c6c4ed67f9cbc-rootfs.mount: Deactivated successfully. Dec 12 17:29:37.960748 kubelet[2788]: I1212 17:29:37.960257 2788 scope.go:117] "RemoveContainer" containerID="141bfb998dfec7bf5820e408aaf2856e330d2c0d81a57d59f00c6c4ed67f9cbc" Dec 12 17:29:37.963695 containerd[1562]: time="2025-12-12T17:29:37.963484245Z" level=info msg="CreateContainer within sandbox \"a43c6af349d2977b47896da381fc2905b45cc17823e4af24856bc18bccf43796\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Dec 12 17:29:37.983397 containerd[1562]: time="2025-12-12T17:29:37.981275815Z" level=info msg="Container 7fd74a7ed3471ca17001a0625d8e5eae11974fabc7778e28ce0e1f3385a26a35: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:29:37.984999 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2614218600.mount: Deactivated successfully. 
Dec 12 17:29:37.994597 containerd[1562]: time="2025-12-12T17:29:37.994527361Z" level=info msg="CreateContainer within sandbox \"a43c6af349d2977b47896da381fc2905b45cc17823e4af24856bc18bccf43796\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"7fd74a7ed3471ca17001a0625d8e5eae11974fabc7778e28ce0e1f3385a26a35\"" Dec 12 17:29:37.996288 containerd[1562]: time="2025-12-12T17:29:37.995588136Z" level=info msg="StartContainer for \"7fd74a7ed3471ca17001a0625d8e5eae11974fabc7778e28ce0e1f3385a26a35\"" Dec 12 17:29:37.997505 containerd[1562]: time="2025-12-12T17:29:37.997468882Z" level=info msg="connecting to shim 7fd74a7ed3471ca17001a0625d8e5eae11974fabc7778e28ce0e1f3385a26a35" address="unix:///run/containerd/s/f5a5b54b7ce646a0c511cfefd1dd137a3bc94e9ca97bab77ce392b893a1b22ad" protocol=ttrpc version=3 Dec 12 17:29:38.041231 systemd[1]: Started cri-containerd-7fd74a7ed3471ca17001a0625d8e5eae11974fabc7778e28ce0e1f3385a26a35.scope - libcontainer container 7fd74a7ed3471ca17001a0625d8e5eae11974fabc7778e28ce0e1f3385a26a35. 
Dec 12 17:29:38.095687 containerd[1562]: time="2025-12-12T17:29:38.095536497Z" level=info msg="StartContainer for \"7fd74a7ed3471ca17001a0625d8e5eae11974fabc7778e28ce0e1f3385a26a35\" returns successfully" Dec 12 17:29:38.107366 containerd[1562]: time="2025-12-12T17:29:38.107306503Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:29:38.489420 containerd[1562]: time="2025-12-12T17:29:38.488699252Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:29:38.490385 containerd[1562]: time="2025-12-12T17:29:38.490343715Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:29:38.490951 containerd[1562]: time="2025-12-12T17:29:38.490475917Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:29:38.491427 kubelet[2788]: E1212 17:29:38.491096 2788 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:29:38.491427 kubelet[2788]: E1212 17:29:38.491147 2788 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:29:38.491427 kubelet[2788]: E1212 17:29:38.491348 2788 
kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gdt9m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-74c4865b69-rq46q_calico-apiserver(96b8454b-22d4-4695-8613-4bfabbdf8fc4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:29:38.492643 containerd[1562]: time="2025-12-12T17:29:38.492620747Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:29:38.493015 kubelet[2788]: E1212 17:29:38.492952 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-74c4865b69-rq46q" podUID="96b8454b-22d4-4695-8613-4bfabbdf8fc4" Dec 12 17:29:38.830402 containerd[1562]: 
time="2025-12-12T17:29:38.830324963Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:29:38.831972 containerd[1562]: time="2025-12-12T17:29:38.831855904Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:29:38.832556 containerd[1562]: time="2025-12-12T17:29:38.831921665Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:29:38.832629 kubelet[2788]: E1212 17:29:38.832311 2788 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:29:38.832629 kubelet[2788]: E1212 17:29:38.832359 2788 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:29:38.832629 kubelet[2788]: E1212 17:29:38.832494 2788 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l79zc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-74c4865b69-4lwrd_calico-apiserver(2598be35-bdd7-4b8f-994d-8273d0db5ae9): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:29:38.833762 kubelet[2788]: E1212 17:29:38.833702 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-74c4865b69-4lwrd" podUID="2598be35-bdd7-4b8f-994d-8273d0db5ae9" Dec 12 17:29:39.106814 kubelet[2788]: E1212 17:29:39.106669 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8f9f7d5ff-pj77s" podUID="c7baeefe-1ae7-4ac7-a668-035dfb7baaef" Dec 12 17:29:40.106417 containerd[1562]: time="2025-12-12T17:29:40.106330414Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 12 17:29:40.462361 containerd[1562]: time="2025-12-12T17:29:40.461899638Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:29:40.464447 containerd[1562]: time="2025-12-12T17:29:40.464314712Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 12 17:29:40.464447 containerd[1562]: time="2025-12-12T17:29:40.464382393Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 12 17:29:40.464896 kubelet[2788]: E1212 17:29:40.464832 2788 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:29:40.465490 kubelet[2788]: E1212 17:29:40.465261 2788 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:29:40.465490 kubelet[2788]: E1212 17:29:40.465420 2788 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6lrv7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-qj8f4_calico-system(b3554d30-e274-4b18-8389-696d2cc03c37): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": 
ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 12 17:29:40.467642 containerd[1562]: time="2025-12-12T17:29:40.467481437Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 12 17:29:40.821437 containerd[1562]: time="2025-12-12T17:29:40.821368277Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:29:40.822858 containerd[1562]: time="2025-12-12T17:29:40.822757176Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 12 17:29:40.822978 containerd[1562]: time="2025-12-12T17:29:40.822949779Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 12 17:29:40.823195 kubelet[2788]: E1212 17:29:40.823146 2788 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:29:40.823385 kubelet[2788]: E1212 17:29:40.823298 2788 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:29:40.823770 kubelet[2788]: E1212 
17:29:40.823690 2788 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6lrv7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
csi-node-driver-qj8f4_calico-system(b3554d30-e274-4b18-8389-696d2cc03c37): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 12 17:29:40.825050 kubelet[2788]: E1212 17:29:40.824994 2788 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-qj8f4" podUID="b3554d30-e274-4b18-8389-696d2cc03c37"